Jan 23 18:29:37.052598 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 23 15:50:57 -00 2026 Jan 23 18:29:37.052635 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:29:37.052644 kernel: BIOS-provided physical RAM map: Jan 23 18:29:37.052650 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 23 18:29:37.052655 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 23 18:29:37.052660 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 23 18:29:37.052669 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Jan 23 18:29:37.052675 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 23 18:29:37.052680 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 23 18:29:37.052686 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 23 18:29:37.052691 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable Jan 23 18:29:37.052696 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 23 18:29:37.052702 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 23 18:29:37.052707 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 23 18:29:37.052716 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 23 18:29:37.052722 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 23 18:29:37.052728 kernel: BIOS-e820: [mem 
0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 23 18:29:37.052734 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 23 18:29:37.052739 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 23 18:29:37.052745 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 23 18:29:37.052753 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 23 18:29:37.052758 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 23 18:29:37.052764 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 23 18:29:37.052770 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 23 18:29:37.052775 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 23 18:29:37.052781 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 23 18:29:37.052787 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 23 18:29:37.052793 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 23 18:29:37.052798 kernel: NX (Execute Disable) protection: active Jan 23 18:29:37.052804 kernel: APIC: Static calls initialized Jan 23 18:29:37.052810 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable Jan 23 18:29:37.052818 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable Jan 23 18:29:37.052824 kernel: extended physical RAM map: Jan 23 18:29:37.052830 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 23 18:29:37.052835 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 23 18:29:37.052841 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 23 18:29:37.052847 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Jan 23 18:29:37.052853 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 23 18:29:37.052858 
kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 23 18:29:37.052864 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 23 18:29:37.052874 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable Jan 23 18:29:37.052880 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable Jan 23 18:29:37.052887 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable Jan 23 18:29:37.052893 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable Jan 23 18:29:37.052900 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable Jan 23 18:29:37.052907 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 23 18:29:37.052913 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 23 18:29:37.052919 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 23 18:29:37.052925 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 23 18:29:37.052931 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 23 18:29:37.052937 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 23 18:29:37.052943 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 23 18:29:37.052949 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 23 18:29:37.052955 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 23 18:29:37.052961 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 23 18:29:37.052969 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 23 18:29:37.052975 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 23 18:29:37.052981 kernel: reserve setup_data: [mem 
0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 23 18:29:37.052987 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 23 18:29:37.052993 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 23 18:29:37.052999 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 23 18:29:37.053005 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 23 18:29:37.053012 kernel: efi: EFI v2.7 by EDK II Jan 23 18:29:37.053018 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018 Jan 23 18:29:37.053024 kernel: random: crng init done Jan 23 18:29:37.053030 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 23 18:29:37.053038 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 23 18:29:37.053044 kernel: secureboot: Secure boot disabled Jan 23 18:29:37.053050 kernel: SMBIOS 2.8 present. Jan 23 18:29:37.053056 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Jan 23 18:29:37.053063 kernel: DMI: Memory slots populated: 1/1 Jan 23 18:29:37.053069 kernel: Hypervisor detected: KVM Jan 23 18:29:37.053075 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 23 18:29:37.053081 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 23 18:29:37.053087 kernel: kvm-clock: using sched offset of 4912566833 cycles Jan 23 18:29:37.053093 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 23 18:29:37.053102 kernel: tsc: Detected 2294.586 MHz processor Jan 23 18:29:37.053109 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 23 18:29:37.053116 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 23 18:29:37.053123 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000 Jan 23 18:29:37.053130 kernel: MTRR map: 4 entries (2 fixed + 2 
variable; max 18), built from 8 variable MTRRs Jan 23 18:29:37.053137 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 23 18:29:37.053144 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 23 18:29:37.053151 kernel: Using GB pages for direct mapping Jan 23 18:29:37.053160 kernel: ACPI: Early table checksum verification disabled Jan 23 18:29:37.053167 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 23 18:29:37.053174 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Jan 23 18:29:37.053181 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:29:37.053188 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:29:37.053195 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 23 18:29:37.053202 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:29:37.053210 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:29:37.053217 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:29:37.053224 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 23 18:29:37.053231 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3] Jan 23 18:29:37.053238 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b] Jan 23 18:29:37.053245 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 23 18:29:37.053252 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f] Jan 23 18:29:37.053261 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b] Jan 23 18:29:37.053268 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027] Jan 23 18:29:37.053275 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037] Jan 23 18:29:37.053281 kernel: No NUMA configuration found Jan 23 
18:29:37.053288 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff] Jan 23 18:29:37.053295 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff] Jan 23 18:29:37.053302 kernel: Zone ranges: Jan 23 18:29:37.053309 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 23 18:29:37.053317 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 23 18:29:37.053324 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff] Jan 23 18:29:37.053331 kernel: Device empty Jan 23 18:29:37.053338 kernel: Movable zone start for each node Jan 23 18:29:37.053344 kernel: Early memory node ranges Jan 23 18:29:37.053351 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 23 18:29:37.053358 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Jan 23 18:29:37.053365 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Jan 23 18:29:37.053374 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Jan 23 18:29:37.053380 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff] Jan 23 18:29:37.053387 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff] Jan 23 18:29:37.053394 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff] Jan 23 18:29:37.053408 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff] Jan 23 18:29:37.053416 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff] Jan 23 18:29:37.053423 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff] Jan 23 18:29:37.053431 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff] Jan 23 18:29:37.053438 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 23 18:29:37.053446 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 23 18:29:37.053455 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Jan 23 18:29:37.053463 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 23 18:29:37.053470 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Jan 23 18:29:37.053478 kernel: On node 0, 
zone DMA32: 193 pages in unavailable ranges Jan 23 18:29:37.053487 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Jan 23 18:29:37.053495 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 23 18:29:37.053503 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Jan 23 18:29:37.053510 kernel: On node 0, zone Normal: 276 pages in unavailable ranges Jan 23 18:29:37.054287 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 23 18:29:37.054297 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 23 18:29:37.054306 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 23 18:29:37.054319 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 23 18:29:37.054328 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 23 18:29:37.054338 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 23 18:29:37.054347 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 23 18:29:37.054356 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jan 23 18:29:37.054365 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 23 18:29:37.054374 kernel: TSC deadline timer available Jan 23 18:29:37.054384 kernel: CPU topo: Max. logical packages: 2 Jan 23 18:29:37.054393 kernel: CPU topo: Max. logical dies: 2 Jan 23 18:29:37.054402 kernel: CPU topo: Max. dies per package: 1 Jan 23 18:29:37.054410 kernel: CPU topo: Max. threads per core: 1 Jan 23 18:29:37.054418 kernel: CPU topo: Num. cores per package: 1 Jan 23 18:29:37.054426 kernel: CPU topo: Num. 
threads per package: 1 Jan 23 18:29:37.054435 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 23 18:29:37.054443 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 23 18:29:37.054453 kernel: kvm-guest: KVM setup pv remote TLB flush Jan 23 18:29:37.054462 kernel: kvm-guest: setup PV sched yield Jan 23 18:29:37.054470 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 23 18:29:37.054479 kernel: Booting paravirtualized kernel on KVM Jan 23 18:29:37.054488 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 23 18:29:37.054497 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 23 18:29:37.054505 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 23 18:29:37.054515 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 23 18:29:37.054524 kernel: pcpu-alloc: [0] 0 1 Jan 23 18:29:37.054532 kernel: kvm-guest: PV spinlocks enabled Jan 23 18:29:37.054541 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 23 18:29:37.054550 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:29:37.054559 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 23 18:29:37.054566 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 23 18:29:37.054574 kernel: Fallback order for Node 0: 0 Jan 23 18:29:37.054581 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 1046694 Jan 23 18:29:37.054588 kernel: Policy zone: Normal Jan 23 18:29:37.054595 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 23 18:29:37.054608 kernel: software IO TLB: area num 2. Jan 23 18:29:37.054615 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 23 18:29:37.054622 kernel: ftrace: allocating 40097 entries in 157 pages Jan 23 18:29:37.054631 kernel: ftrace: allocated 157 pages with 5 groups Jan 23 18:29:37.054638 kernel: Dynamic Preempt: voluntary Jan 23 18:29:37.054645 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 23 18:29:37.054652 kernel: rcu: RCU event tracing is enabled. Jan 23 18:29:37.054659 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 23 18:29:37.054666 kernel: Trampoline variant of Tasks RCU enabled. Jan 23 18:29:37.054673 kernel: Rude variant of Tasks RCU enabled. Jan 23 18:29:37.054680 kernel: Tracing variant of Tasks RCU enabled. Jan 23 18:29:37.054688 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 23 18:29:37.054695 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 23 18:29:37.054702 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 23 18:29:37.054709 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 23 18:29:37.054716 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 23 18:29:37.054723 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 23 18:29:37.054730 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Jan 23 18:29:37.054739 kernel: Console: colour dummy device 80x25 Jan 23 18:29:37.054746 kernel: printk: legacy console [tty0] enabled Jan 23 18:29:37.054753 kernel: printk: legacy console [ttyS0] enabled Jan 23 18:29:37.054760 kernel: ACPI: Core revision 20240827 Jan 23 18:29:37.054767 kernel: APIC: Switch to symmetric I/O mode setup Jan 23 18:29:37.054774 kernel: x2apic enabled Jan 23 18:29:37.054781 kernel: APIC: Switched APIC routing to: physical x2apic Jan 23 18:29:37.054790 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Jan 23 18:29:37.054797 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Jan 23 18:29:37.054804 kernel: kvm-guest: setup PV IPIs Jan 23 18:29:37.054811 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21133ac8314, max_idle_ns: 440795303427 ns Jan 23 18:29:37.054818 kernel: Calibrating delay loop (skipped) preset value.. 4589.17 BogoMIPS (lpj=2294586) Jan 23 18:29:37.054826 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 23 18:29:37.054833 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 23 18:29:37.054842 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 23 18:29:37.054848 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 23 18:29:37.054854 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jan 23 18:29:37.054861 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 23 18:29:37.054867 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 23 18:29:37.054874 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 23 18:29:37.054881 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 23 18:29:37.054887 kernel: TAA: Mitigation: Clear CPU buffers Jan 23 18:29:37.054893 kernel: MMIO Stale Data: Mitigation: Clear CPU 
buffers Jan 23 18:29:37.054900 kernel: active return thunk: its_return_thunk Jan 23 18:29:37.054907 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 23 18:29:37.054914 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 23 18:29:37.054921 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 23 18:29:37.054928 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 23 18:29:37.054934 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 23 18:29:37.054941 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 23 18:29:37.054947 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 23 18:29:37.054954 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 23 18:29:37.054960 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 23 18:29:37.054967 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 23 18:29:37.054975 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 23 18:29:37.054981 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 23 18:29:37.054987 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 23 18:29:37.054994 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. Jan 23 18:29:37.055000 kernel: Freeing SMP alternatives memory: 32K Jan 23 18:29:37.055007 kernel: pid_max: default: 32768 minimum: 301 Jan 23 18:29:37.055013 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 23 18:29:37.055020 kernel: landlock: Up and running. Jan 23 18:29:37.055026 kernel: SELinux: Initializing. 
Jan 23 18:29:37.055033 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 23 18:29:37.055039 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 23 18:29:37.055046 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Jan 23 18:29:37.055054 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Jan 23 18:29:37.055061 kernel: ... version: 2 Jan 23 18:29:37.055068 kernel: ... bit width: 48 Jan 23 18:29:37.055075 kernel: ... generic registers: 8 Jan 23 18:29:37.055082 kernel: ... value mask: 0000ffffffffffff Jan 23 18:29:37.055089 kernel: ... max period: 00007fffffffffff Jan 23 18:29:37.055096 kernel: ... fixed-purpose events: 3 Jan 23 18:29:37.055104 kernel: ... event mask: 00000007000000ff Jan 23 18:29:37.055111 kernel: signal: max sigframe size: 3632 Jan 23 18:29:37.055118 kernel: rcu: Hierarchical SRCU implementation. Jan 23 18:29:37.055125 kernel: rcu: Max phase no-delay instances is 400. Jan 23 18:29:37.055132 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 23 18:29:37.055139 kernel: smp: Bringing up secondary CPUs ... Jan 23 18:29:37.055146 kernel: smpboot: x86: Booting SMP configuration: Jan 23 18:29:37.055154 kernel: .... 
node #0, CPUs: #1 Jan 23 18:29:37.055161 kernel: smp: Brought up 1 node, 2 CPUs Jan 23 18:29:37.055168 kernel: smpboot: Total of 2 processors activated (9178.34 BogoMIPS) Jan 23 18:29:37.055175 kernel: Memory: 3969768K/4186776K available (14336K kernel code, 2445K rwdata, 31636K rodata, 15532K init, 2508K bss, 212128K reserved, 0K cma-reserved) Jan 23 18:29:37.055182 kernel: devtmpfs: initialized Jan 23 18:29:37.055189 kernel: x86/mm: Memory block size: 128MB Jan 23 18:29:37.055196 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Jan 23 18:29:37.055204 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Jan 23 18:29:37.055211 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Jan 23 18:29:37.055218 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 23 18:29:37.056164 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes) Jan 23 18:29:37.056172 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes) Jan 23 18:29:37.056179 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 23 18:29:37.056186 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 23 18:29:37.056195 kernel: pinctrl core: initialized pinctrl subsystem Jan 23 18:29:37.056202 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 23 18:29:37.056209 kernel: audit: initializing netlink subsys (disabled) Jan 23 18:29:37.056216 kernel: audit: type=2000 audit(1769192974.729:1): state=initialized audit_enabled=0 res=1 Jan 23 18:29:37.056223 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 23 18:29:37.056230 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 23 18:29:37.056237 kernel: cpuidle: using governor menu Jan 23 18:29:37.056246 kernel: acpiphp: ACPI Hot Plug PCI Controller 
Driver version: 0.5 Jan 23 18:29:37.056253 kernel: dca service started, version 1.12.1 Jan 23 18:29:37.056260 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 23 18:29:37.056266 kernel: PCI: Using configuration type 1 for base access Jan 23 18:29:37.056273 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 23 18:29:37.056280 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 23 18:29:37.056287 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 23 18:29:37.056296 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 23 18:29:37.056303 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 23 18:29:37.056309 kernel: ACPI: Added _OSI(Module Device) Jan 23 18:29:37.056316 kernel: ACPI: Added _OSI(Processor Device) Jan 23 18:29:37.056323 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 23 18:29:37.056330 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 23 18:29:37.056337 kernel: ACPI: Interpreter enabled Jan 23 18:29:37.056344 kernel: ACPI: PM: (supports S0 S3 S5) Jan 23 18:29:37.056352 kernel: ACPI: Using IOAPIC for interrupt routing Jan 23 18:29:37.056359 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 23 18:29:37.056366 kernel: PCI: Using E820 reservations for host bridge windows Jan 23 18:29:37.056373 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 23 18:29:37.056380 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 23 18:29:37.056533 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 23 18:29:37.057731 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 23 18:29:37.057820 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 23 18:29:37.057830 kernel: PCI host bridge to bus 0000:00 Jan 23 
18:29:37.057918 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 23 18:29:37.057994 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 23 18:29:37.058068 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 23 18:29:37.058144 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 23 18:29:37.058218 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 23 18:29:37.058291 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Jan 23 18:29:37.058364 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 23 18:29:37.058461 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 23 18:29:37.058556 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 23 18:29:37.060665 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 23 18:29:37.060765 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Jan 23 18:29:37.060850 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Jan 23 18:29:37.060933 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 23 18:29:37.061016 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 23 18:29:37.061114 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.061202 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Jan 23 18:29:37.061293 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 23 18:29:37.061385 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 23 18:29:37.061477 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 23 18:29:37.061581 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.061694 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 
23 18:29:37.061780 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Jan 23 18:29:37.061870 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 23 18:29:37.061961 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 23 18:29:37.062051 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 23 18:29:37.062147 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.062232 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Jan 23 18:29:37.062316 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 23 18:29:37.062400 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 23 18:29:37.062484 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 23 18:29:37.062573 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.064712 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Jan 23 18:29:37.064811 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 23 18:29:37.064898 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 23 18:29:37.064982 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 23 18:29:37.065072 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.065156 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Jan 23 18:29:37.065243 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 23 18:29:37.065328 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 23 18:29:37.065412 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 23 18:29:37.065505 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.065627 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Jan 23 18:29:37.065721 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 23 18:29:37.065809 
kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 23 18:29:37.065891 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 23 18:29:37.065980 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.066066 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Jan 23 18:29:37.066152 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 23 18:29:37.066238 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 23 18:29:37.066322 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 23 18:29:37.066410 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.066493 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Jan 23 18:29:37.066577 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 23 18:29:37.067660 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 23 18:29:37.067745 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 23 18:29:37.067834 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.067918 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Jan 23 18:29:37.068002 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 23 18:29:37.068085 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 23 18:29:37.068169 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 23 18:29:37.068265 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.068366 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Jan 23 18:29:37.068457 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 23 18:29:37.068546 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 23 18:29:37.069650 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] 
Jan 23 18:29:37.069754 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.069851 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Jan 23 18:29:37.069943 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 23 18:29:37.070038 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 23 18:29:37.070130 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 23 18:29:37.070227 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.070322 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Jan 23 18:29:37.070415 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 23 18:29:37.070505 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 23 18:29:37.070594 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 23 18:29:37.070697 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.070790 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Jan 23 18:29:37.070882 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 23 18:29:37.070973 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 23 18:29:37.071064 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 23 18:29:37.071157 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.071250 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Jan 23 18:29:37.071338 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 23 18:29:37.071426 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 23 18:29:37.071515 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 23 18:29:37.075975 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.076102 kernel: pci 0000:00:03.6: BAR 0 [mem 
0x8438f000-0x8438ffff] Jan 23 18:29:37.076201 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 23 18:29:37.076291 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 23 18:29:37.076383 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 23 18:29:37.076482 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.076575 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Jan 23 18:29:37.076679 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 23 18:29:37.076773 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 23 18:29:37.076865 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 23 18:29:37.076963 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.077055 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Jan 23 18:29:37.077147 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 23 18:29:37.077239 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 23 18:29:37.077333 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 23 18:29:37.077430 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.077522 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Jan 23 18:29:37.077642 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 23 18:29:37.077736 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 23 18:29:37.077827 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 23 18:29:37.077928 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.078020 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Jan 23 18:29:37.078111 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 23 18:29:37.078203 kernel: pci 0000:00:04.2: bridge window [mem 
0x81c00000-0x81dfffff] Jan 23 18:29:37.078294 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 23 18:29:37.078394 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.078489 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 23 18:29:37.078580 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 23 18:29:37.078680 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 23 18:29:37.078772 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 23 18:29:37.078868 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.078964 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 23 18:29:37.079057 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 23 18:29:37.079148 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 23 18:29:37.079239 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 23 18:29:37.079338 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.080814 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 23 18:29:37.080932 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 23 18:29:37.081028 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 23 18:29:37.081121 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 23 18:29:37.081222 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.081319 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 23 18:29:37.081416 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 23 18:29:37.081508 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 23 18:29:37.081622 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 23 18:29:37.081725 kernel: pci 
0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.081824 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Jan 23 18:29:37.085118 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 23 18:29:37.085220 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 23 18:29:37.085312 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 23 18:29:37.085416 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.085508 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 23 18:29:37.085647 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 23 18:29:37.085740 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 23 18:29:37.085830 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 23 18:29:37.085928 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.086021 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 23 18:29:37.086110 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 23 18:29:37.086202 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 23 18:29:37.086291 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 23 18:29:37.086388 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.086481 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 23 18:29:37.086573 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 23 18:29:37.086682 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 23 18:29:37.086777 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 23 18:29:37.086874 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.086966 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 23 18:29:37.087059 kernel: 
pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 23 18:29:37.087151 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 23 18:29:37.087243 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 23 18:29:37.087343 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:29:37.087437 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 23 18:29:37.087531 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 23 18:29:37.087635 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 23 18:29:37.087729 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 23 18:29:37.087831 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 23 18:29:37.087930 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 23 18:29:37.088030 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 23 18:29:37.088123 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 23 18:29:37.088215 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 23 18:29:37.088310 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 23 18:29:37.088402 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 23 18:29:37.088506 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 23 18:29:37.088608 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 23 18:29:37.088704 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 23 18:29:37.088799 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 23 18:29:37.088893 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 23 18:29:37.088988 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.089083 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 23 
18:29:37.089186 kernel: pci_bus 0000:02: extended config space not accessible Jan 23 18:29:37.089197 kernel: acpiphp: Slot [1] registered Jan 23 18:29:37.089206 kernel: acpiphp: Slot [0] registered Jan 23 18:29:37.089214 kernel: acpiphp: Slot [2] registered Jan 23 18:29:37.089222 kernel: acpiphp: Slot [3] registered Jan 23 18:29:37.089232 kernel: acpiphp: Slot [4] registered Jan 23 18:29:37.089240 kernel: acpiphp: Slot [5] registered Jan 23 18:29:37.089248 kernel: acpiphp: Slot [6] registered Jan 23 18:29:37.089256 kernel: acpiphp: Slot [7] registered Jan 23 18:29:37.089264 kernel: acpiphp: Slot [8] registered Jan 23 18:29:37.089272 kernel: acpiphp: Slot [9] registered Jan 23 18:29:37.089280 kernel: acpiphp: Slot [10] registered Jan 23 18:29:37.089290 kernel: acpiphp: Slot [11] registered Jan 23 18:29:37.089298 kernel: acpiphp: Slot [12] registered Jan 23 18:29:37.089306 kernel: acpiphp: Slot [13] registered Jan 23 18:29:37.089314 kernel: acpiphp: Slot [14] registered Jan 23 18:29:37.089322 kernel: acpiphp: Slot [15] registered Jan 23 18:29:37.089331 kernel: acpiphp: Slot [16] registered Jan 23 18:29:37.089339 kernel: acpiphp: Slot [17] registered Jan 23 18:29:37.089347 kernel: acpiphp: Slot [18] registered Jan 23 18:29:37.089356 kernel: acpiphp: Slot [19] registered Jan 23 18:29:37.089364 kernel: acpiphp: Slot [20] registered Jan 23 18:29:37.089372 kernel: acpiphp: Slot [21] registered Jan 23 18:29:37.089380 kernel: acpiphp: Slot [22] registered Jan 23 18:29:37.089388 kernel: acpiphp: Slot [23] registered Jan 23 18:29:37.089396 kernel: acpiphp: Slot [24] registered Jan 23 18:29:37.089404 kernel: acpiphp: Slot [25] registered Jan 23 18:29:37.089414 kernel: acpiphp: Slot [26] registered Jan 23 18:29:37.089424 kernel: acpiphp: Slot [27] registered Jan 23 18:29:37.089432 kernel: acpiphp: Slot [28] registered Jan 23 18:29:37.089440 kernel: acpiphp: Slot [29] registered Jan 23 18:29:37.089448 kernel: acpiphp: Slot [30] registered Jan 23 18:29:37.089456 kernel: acpiphp: 
Slot [31] registered Jan 23 18:29:37.089576 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 23 18:29:37.092758 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 23 18:29:37.092976 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 23 18:29:37.092992 kernel: acpiphp: Slot [0-2] registered Jan 23 18:29:37.093098 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 23 18:29:37.093272 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 23 18:29:37.093375 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 23 18:29:37.093480 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 23 18:29:37.093614 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 23 18:29:37.093634 kernel: acpiphp: Slot [0-3] registered Jan 23 18:29:37.093747 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 23 18:29:37.093844 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 23 18:29:37.093940 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 23 18:29:37.094037 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 23 18:29:37.094048 kernel: acpiphp: Slot [0-4] registered Jan 23 18:29:37.094147 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 23 18:29:37.094243 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 23 18:29:37.094336 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 23 18:29:37.094347 kernel: acpiphp: Slot [0-5] registered Jan 23 18:29:37.094450 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 23 18:29:37.094543 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 23 18:29:37.094643 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 23 18:29:37.094735 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 23 18:29:37.094746 kernel: acpiphp: Slot [0-6] 
registered Jan 23 18:29:37.094835 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 23 18:29:37.094847 kernel: acpiphp: Slot [0-7] registered Jan 23 18:29:37.094937 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 23 18:29:37.094948 kernel: acpiphp: Slot [0-8] registered Jan 23 18:29:37.095038 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 23 18:29:37.095049 kernel: acpiphp: Slot [0-9] registered Jan 23 18:29:37.095142 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 23 18:29:37.095154 kernel: acpiphp: Slot [0-10] registered Jan 23 18:29:37.095247 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 23 18:29:37.095258 kernel: acpiphp: Slot [0-11] registered Jan 23 18:29:37.095349 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 23 18:29:37.095360 kernel: acpiphp: Slot [0-12] registered Jan 23 18:29:37.095451 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 23 18:29:37.095462 kernel: acpiphp: Slot [0-13] registered Jan 23 18:29:37.095563 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 23 18:29:37.095574 kernel: acpiphp: Slot [0-14] registered Jan 23 18:29:37.095680 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 23 18:29:37.095692 kernel: acpiphp: Slot [0-15] registered Jan 23 18:29:37.095783 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 23 18:29:37.095793 kernel: acpiphp: Slot [0-16] registered Jan 23 18:29:37.095887 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 23 18:29:37.095898 kernel: acpiphp: Slot [0-17] registered Jan 23 18:29:37.095990 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 23 18:29:37.096001 kernel: acpiphp: Slot [0-18] registered Jan 23 18:29:37.096092 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 23 18:29:37.096103 kernel: acpiphp: Slot [0-19] registered Jan 23 18:29:37.096193 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 23 18:29:37.096206 kernel: acpiphp: Slot [0-20] registered Jan 23 18:29:37.096299 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 23 18:29:37.096309 
kernel: acpiphp: Slot [0-21] registered Jan 23 18:29:37.096398 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 23 18:29:37.096408 kernel: acpiphp: Slot [0-22] registered Jan 23 18:29:37.096496 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 23 18:29:37.096509 kernel: acpiphp: Slot [0-23] registered Jan 23 18:29:37.096598 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 23 18:29:37.096616 kernel: acpiphp: Slot [0-24] registered Jan 23 18:29:37.096705 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 23 18:29:37.096716 kernel: acpiphp: Slot [0-25] registered Jan 23 18:29:37.096804 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 23 18:29:37.096817 kernel: acpiphp: Slot [0-26] registered Jan 23 18:29:37.096906 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 23 18:29:37.096917 kernel: acpiphp: Slot [0-27] registered Jan 23 18:29:37.097005 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 23 18:29:37.097015 kernel: acpiphp: Slot [0-28] registered Jan 23 18:29:37.097104 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 23 18:29:37.097114 kernel: acpiphp: Slot [0-29] registered Jan 23 18:29:37.097217 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 23 18:29:37.097229 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 23 18:29:37.097237 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 23 18:29:37.097245 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 23 18:29:37.097254 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 23 18:29:37.097262 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 23 18:29:37.097270 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 23 18:29:37.097280 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 23 18:29:37.097289 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 23 18:29:37.097297 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 23 18:29:37.097305 kernel: ACPI: PCI: Interrupt 
link GSIB configured for IRQ 17 Jan 23 18:29:37.097314 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 23 18:29:37.097322 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 23 18:29:37.097330 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 23 18:29:37.097340 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 23 18:29:37.097348 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 23 18:29:37.097356 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 23 18:29:37.097364 kernel: iommu: Default domain type: Translated Jan 23 18:29:37.097372 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 23 18:29:37.097380 kernel: efivars: Registered efivars operations Jan 23 18:29:37.097389 kernel: PCI: Using ACPI for IRQ routing Jan 23 18:29:37.097399 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 23 18:29:37.097407 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 23 18:29:37.097415 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 23 18:29:37.097423 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 23 18:29:37.097431 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 23 18:29:37.097439 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 23 18:29:37.097447 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 23 18:29:37.097457 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 23 18:29:37.097465 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 23 18:29:37.097473 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 23 18:29:37.097577 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 23 18:29:37.097685 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 23 18:29:37.097779 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 23 18:29:37.097789 kernel: vgaarb: loaded Jan 23 18:29:37.097800 
kernel: clocksource: Switched to clocksource kvm-clock Jan 23 18:29:37.097808 kernel: VFS: Disk quotas dquot_6.6.0 Jan 23 18:29:37.097817 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 23 18:29:37.097825 kernel: pnp: PnP ACPI init Jan 23 18:29:37.097926 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 23 18:29:37.097938 kernel: pnp: PnP ACPI: found 5 devices Jan 23 18:29:37.097948 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 23 18:29:37.097957 kernel: NET: Registered PF_INET protocol family Jan 23 18:29:37.097965 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 23 18:29:37.097973 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 23 18:29:37.097981 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 23 18:29:37.097990 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 23 18:29:37.097998 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 23 18:29:37.098008 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 23 18:29:37.098016 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 23 18:29:37.098024 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 23 18:29:37.098032 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 23 18:29:37.098041 kernel: NET: Registered PF_XDP protocol family Jan 23 18:29:37.098138 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 23 18:29:37.098231 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 23 18:29:37.098327 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 23 18:29:37.098418 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] 
add_size 1000 Jan 23 18:29:37.098508 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 23 18:29:37.098595 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 23 18:29:37.098689 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 23 18:29:37.098779 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 23 18:29:37.098872 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 23 18:29:37.098961 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 23 18:29:37.099053 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 23 18:29:37.099143 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 23 18:29:37.099227 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 23 18:29:37.099314 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 23 18:29:37.099403 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 23 18:29:37.099492 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 23 18:29:37.099577 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 23 18:29:37.099673 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 23 18:29:37.099758 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 23 18:29:37.099842 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 23 18:29:37.099926 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 23 18:29:37.100012 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 23 18:29:37.100095 kernel: pci 
0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 23 18:29:37.100181 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 23 18:29:37.100269 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 23 18:29:37.100356 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 23 18:29:37.100437 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 23 18:29:37.100521 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 23 18:29:37.100609 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 23 18:29:37.100704 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 23 18:29:37.100785 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 23 18:29:37.100867 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 23 18:29:37.100949 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 23 18:29:37.101033 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 23 18:29:37.101119 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 23 18:29:37.101207 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 23 18:29:37.101337 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 23 18:29:37.101437 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 23 18:29:37.101539 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 23 18:29:37.101642 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 23 18:29:37.101734 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 23 18:29:37.101823 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 23 18:29:37.101918 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: 
can't assign; no space Jan 23 18:29:37.102007 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102096 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.102184 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102273 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.102369 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102452 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.102535 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102631 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.102716 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102803 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.102894 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.102985 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103077 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.103168 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103257 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.103345 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103431 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.103514 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103597 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.103700 kernel: pci 0000:00:05.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103789 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.103878 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.103967 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.104056 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.105713 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.105823 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.105913 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.106000 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.106085 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.106172 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 23 18:29:37.106257 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 23 18:29:37.106346 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 23 18:29:37.106436 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 23 18:29:37.106525 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 23 18:29:37.106625 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 23 18:29:37.106711 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 23 18:29:37.106800 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 23 18:29:37.106887 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 23 18:29:37.106977 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 23 18:29:37.107066 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 23 18:29:37.107153 kernel: 
pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 23 18:29:37.107238 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 23 18:29:37.107330 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.107420 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.107510 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.107599 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.107698 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.107788 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.107877 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.107963 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.108048 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.108133 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.108217 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.108300 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.108390 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.108482 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.108569 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.110316 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.110424 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.110521 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.110628 
kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.110721 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.110820 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.110912 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.111008 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.111103 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.111196 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.111289 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.111385 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.111477 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.111570 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 18:29:37.111687 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 23 18:29:37.111787 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 23 18:29:37.111881 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 23 18:29:37.111975 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 23 18:29:37.112073 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.112166 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 23 18:29:37.112258 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 23 18:29:37.112349 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 23 18:29:37.112443 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.112540 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned 
Jan 23 18:29:37.112681 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 23 18:29:37.112776 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 23 18:29:37.112869 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 23 18:29:37.112962 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 23 18:29:37.113053 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 23 18:29:37.113145 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 23 18:29:37.113236 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 23 18:29:37.113327 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 23 18:29:37.113418 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 23 18:29:37.113513 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 23 18:29:37.113634 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 23 18:29:37.113728 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 23 18:29:37.113820 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 23 18:29:37.113911 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 23 18:29:37.114004 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 23 18:29:37.114099 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 23 18:29:37.114190 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 23 18:29:37.114283 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 23 18:29:37.114374 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 23 18:29:37.114464 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 23 18:29:37.114555 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 23 18:29:37.114861 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 23 
18:29:37.114953 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 23 18:29:37.115037 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 23 18:29:37.115122 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 23 18:29:37.115206 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 23 18:29:37.115289 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 23 18:29:37.115372 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 23 18:29:37.115742 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 23 18:29:37.115840 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 23 18:29:37.115929 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 23 18:29:37.116018 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 23 18:29:37.116107 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 23 18:29:37.116196 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 23 18:29:37.116279 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 23 18:29:37.116363 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 23 18:29:37.116446 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 23 18:29:37.116532 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 23 18:29:37.116624 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 23 18:29:37.116710 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 23 18:29:37.116792 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 23 18:29:37.116876 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 23 18:29:37.116961 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 23 18:29:37.117046 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 
23 18:29:37.117129 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 23 18:29:37.117215 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 23 18:29:37.117303 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 23 18:29:37.117392 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 23 18:29:37.117481 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 23 18:29:37.117590 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 23 18:29:37.117687 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 23 18:29:37.117772 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 23 18:29:37.117862 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 23 18:29:37.117951 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 23 18:29:37.118039 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 23 18:29:37.119794 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 23 18:29:37.119903 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 23 18:29:37.119995 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 23 18:29:37.120082 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 23 18:29:37.120167 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 23 18:29:37.120253 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 23 18:29:37.120341 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 23 18:29:37.120426 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 23 18:29:37.120510 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 23 18:29:37.120593 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 23 18:29:37.120692 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 23 18:29:37.120778 kernel: pci 
0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 23 18:29:37.120877 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 23 18:29:37.120967 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 23 18:29:37.121058 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 23 18:29:37.121146 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 23 18:29:37.121236 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 23 18:29:37.121325 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 23 18:29:37.121416 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 23 18:29:37.121515 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 23 18:29:37.121669 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 23 18:29:37.121761 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 23 18:29:37.121851 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 23 18:29:37.121941 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 23 18:29:37.122030 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 23 18:29:37.122122 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 23 18:29:37.122216 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 23 18:29:37.122305 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 23 18:29:37.122394 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 23 18:29:37.122483 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 23 18:29:37.122574 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 23 18:29:37.122674 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 23 18:29:37.122763 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 23 18:29:37.122854 kernel: pci 0000:00:05.2: bridge window [mem 
0x38d000000000-0x38d7ffffffff 64bit pref] Jan 23 18:29:37.122947 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 23 18:29:37.123039 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 23 18:29:37.123130 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 23 18:29:37.123222 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 23 18:29:37.123313 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 23 18:29:37.123402 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 23 18:29:37.123493 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 23 18:29:37.123582 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 23 18:29:37.124688 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 23 18:29:37.124787 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 23 18:29:37.124870 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 23 18:29:37.124952 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 23 18:29:37.125032 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 23 18:29:37.125114 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 23 18:29:37.125211 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 23 18:29:37.125299 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 23 18:29:37.125383 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.125474 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 23 18:29:37.125579 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 23 18:29:37.125681 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 18:29:37.125772 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 23 18:29:37.125858 kernel: pci_bus 0000:03: resource 2 [mem 
0x380800000000-0x380fffffffff 64bit pref] Jan 23 18:29:37.125943 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 23 18:29:37.126023 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 23 18:29:37.126107 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 23 18:29:37.126186 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 23 18:29:37.126273 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 23 18:29:37.126352 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 23 18:29:37.126438 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 23 18:29:37.126519 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 23 18:29:37.127968 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 23 18:29:37.128077 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 23 18:29:37.128168 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 23 18:29:37.128253 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 23 18:29:37.128344 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Jan 23 18:29:37.128428 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 23 18:29:37.128517 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 23 18:29:37.128617 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 23 18:29:37.128709 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 23 18:29:37.128789 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 23 18:29:37.128873 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 23 18:29:37.128956 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 23 18:29:37.129043 
kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 23 18:29:37.129131 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 23 18:29:37.129221 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 23 18:29:37.129307 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 23 18:29:37.129403 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 23 18:29:37.129491 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 23 18:29:37.130024 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 23 18:29:37.131756 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 23 18:29:37.131866 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 23 18:29:37.131957 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 23 18:29:37.132041 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 23 18:29:37.132131 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 23 18:29:37.132216 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 23 18:29:37.132299 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 23 18:29:37.132391 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 23 18:29:37.132478 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 23 18:29:37.132561 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 23 18:29:37.132673 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 23 18:29:37.132753 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 23 18:29:37.132838 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 23 18:29:37.132921 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 23 18:29:37.133003 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] 
Jan 23 18:29:37.133081 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 23 18:29:37.133167 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 23 18:29:37.133251 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 23 18:29:37.133334 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 23 18:29:37.133428 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 23 18:29:37.133516 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 23 18:29:37.135293 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 23 18:29:37.135399 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 23 18:29:37.135480 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Jan 23 18:29:37.135559 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 23 18:29:37.135672 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 23 18:29:37.135751 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 23 18:29:37.135830 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 23 18:29:37.135912 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 23 18:29:37.135989 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 23 18:29:37.136068 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 23 18:29:37.136150 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 23 18:29:37.136226 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 23 18:29:37.136304 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 23 18:29:37.136387 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 23 18:29:37.136464 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 23 18:29:37.136543 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit 
pref] Jan 23 18:29:37.137508 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 23 18:29:37.137673 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 23 18:29:37.137757 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 23 18:29:37.137767 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 23 18:29:37.137776 kernel: PCI: CLS 0 bytes, default 64 Jan 23 18:29:37.137788 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 23 18:29:37.137795 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 23 18:29:37.137803 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 23 18:29:37.137810 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21133ac8314, max_idle_ns: 440795303427 ns Jan 23 18:29:37.137818 kernel: Initialise system trusted keyrings Jan 23 18:29:37.137826 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 23 18:29:37.137833 kernel: Key type asymmetric registered Jan 23 18:29:37.137843 kernel: Asymmetric key parser 'x509' registered Jan 23 18:29:37.137850 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 23 18:29:37.137858 kernel: io scheduler mq-deadline registered Jan 23 18:29:37.137865 kernel: io scheduler kyber registered Jan 23 18:29:37.137873 kernel: io scheduler bfq registered Jan 23 18:29:37.137967 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 23 18:29:37.138056 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 23 18:29:37.138149 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 23 18:29:37.138237 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 23 18:29:37.138329 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 23 18:29:37.138421 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 23 18:29:37.138514 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 23 18:29:37.138615 
kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 23 18:29:37.138709 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 23 18:29:37.138795 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 23 18:29:37.138881 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 23 18:29:37.138966 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 23 18:29:37.139055 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 23 18:29:37.139142 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 23 18:29:37.139227 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 23 18:29:37.139314 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 23 18:29:37.139324 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 23 18:29:37.139417 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 23 18:29:37.139508 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 23 18:29:37.139608 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 23 18:29:37.139695 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 23 18:29:37.139782 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 23 18:29:37.139869 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 23 18:29:37.139958 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 23 18:29:37.140048 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 23 18:29:37.140138 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 23 18:29:37.140229 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 23 18:29:37.140318 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 23 18:29:37.140402 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 23 18:29:37.140487 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 23 18:29:37.140571 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 23 18:29:37.140664 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 23 18:29:37.140752 kernel: 
pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 23 18:29:37.140762 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 23 18:29:37.140846 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 23 18:29:37.140932 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 23 18:29:37.141022 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 23 18:29:37.141113 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 23 18:29:37.141203 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 23 18:29:37.141296 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 23 18:29:37.141386 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 23 18:29:37.141476 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 23 18:29:37.141582 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 23 18:29:37.141687 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 23 18:29:37.141779 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 23 18:29:37.141872 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 23 18:29:37.141967 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 23 18:29:37.142060 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 23 18:29:37.142152 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 23 18:29:37.142245 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 23 18:29:37.142256 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 23 18:29:37.142347 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 23 18:29:37.142442 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 23 18:29:37.142536 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 23 18:29:37.142886 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 23 18:29:37.142990 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 23 18:29:37.143085 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 23 18:29:37.143179 kernel: pcieport 0000:00:05.3: 
PME: Signaling with IRQ 51 Jan 23 18:29:37.143272 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 23 18:29:37.143733 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 23 18:29:37.143834 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 23 18:29:37.143845 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 23 18:29:37.143854 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 18:29:37.143863 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 23 18:29:37.143871 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 23 18:29:37.143880 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 23 18:29:37.143891 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 23 18:29:37.143992 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 23 18:29:37.144082 kernel: rtc_cmos 00:03: registered as rtc0 Jan 23 18:29:37.144167 kernel: rtc_cmos 00:03: setting system clock to 2026-01-23T18:29:35 UTC (1769192975) Jan 23 18:29:37.144255 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 23 18:29:37.144265 kernel: intel_pstate: CPU model not supported Jan 23 18:29:37.144276 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Jan 23 18:29:37.144284 kernel: efifb: probing for efifb Jan 23 18:29:37.144293 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 23 18:29:37.144301 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 23 18:29:37.144309 kernel: efifb: scrolling: redraw Jan 23 18:29:37.144317 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 23 18:29:37.144325 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 18:29:37.144335 kernel: fb0: EFI VGA frame buffer device Jan 23 18:29:37.144343 kernel: pstore: Using crash dump compression: deflate Jan 23 18:29:37.144351 kernel: pstore: Registered efi_pstore as persistent store backend Jan 23 
18:29:37.144359 kernel: NET: Registered PF_INET6 protocol family Jan 23 18:29:37.144367 kernel: Segment Routing with IPv6 Jan 23 18:29:37.144375 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 18:29:37.144383 kernel: NET: Registered PF_PACKET protocol family Jan 23 18:29:37.144394 kernel: Key type dns_resolver registered Jan 23 18:29:37.144402 kernel: IPI shorthand broadcast: enabled Jan 23 18:29:37.144410 kernel: sched_clock: Marking stable (2313001777, 144416218)->(2551526514, -94108519) Jan 23 18:29:37.144418 kernel: registered taskstats version 1 Jan 23 18:29:37.144426 kernel: Loading compiled-in X.509 certificates Jan 23 18:29:37.144434 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: ed4528912f8413ae803010e63385bcf7ed197cf1' Jan 23 18:29:37.144442 kernel: Demotion targets for Node 0: null Jan 23 18:29:37.144451 kernel: Key type .fscrypt registered Jan 23 18:29:37.144459 kernel: Key type fscrypt-provisioning registered Jan 23 18:29:37.144467 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 23 18:29:37.144475 kernel: ima: Allocated hash algorithm: sha1 Jan 23 18:29:37.144483 kernel: ima: No architecture policies found Jan 23 18:29:37.144490 kernel: clk: Disabling unused clocks Jan 23 18:29:37.144498 kernel: Freeing unused kernel image (initmem) memory: 15532K Jan 23 18:29:37.144506 kernel: Write protecting the kernel read-only data: 47104k Jan 23 18:29:37.144515 kernel: Freeing unused kernel image (rodata/data gap) memory: 1132K Jan 23 18:29:37.144523 kernel: Run /init as init process Jan 23 18:29:37.144531 kernel: with arguments: Jan 23 18:29:37.144540 kernel: /init Jan 23 18:29:37.144548 kernel: with environment: Jan 23 18:29:37.144555 kernel: HOME=/ Jan 23 18:29:37.144563 kernel: TERM=linux Jan 23 18:29:37.144572 kernel: SCSI subsystem initialized Jan 23 18:29:37.144580 kernel: libata version 3.00 loaded. 
Jan 23 18:29:37.144688 kernel: ahci 0000:00:1f.2: version 3.0 Jan 23 18:29:37.144699 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 23 18:29:37.144789 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 23 18:29:37.144879 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 23 18:29:37.144970 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 23 18:29:37.145077 kernel: scsi host0: ahci Jan 23 18:29:37.145175 kernel: scsi host1: ahci Jan 23 18:29:37.145290 kernel: scsi host2: ahci Jan 23 18:29:37.145387 kernel: scsi host3: ahci Jan 23 18:29:37.146976 kernel: scsi host4: ahci Jan 23 18:29:37.147096 kernel: scsi host5: ahci Jan 23 18:29:37.147107 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 23 18:29:37.147116 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 23 18:29:37.147124 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 23 18:29:37.147131 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 23 18:29:37.147139 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 23 18:29:37.147149 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 23 18:29:37.147156 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147164 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147172 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147179 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147188 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147195 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 23 18:29:37.147202 kernel: ACPI: bus type USB registered Jan 23 18:29:37.147211 kernel: usbcore: registered new interface driver usbfs Jan 23 18:29:37.147219 kernel: usbcore: registered 
new interface driver hub Jan 23 18:29:37.147227 kernel: usbcore: registered new device driver usb Jan 23 18:29:37.147320 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 23 18:29:37.147407 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 23 18:29:37.147493 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 23 18:29:37.147578 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 23 18:29:37.147714 kernel: hub 1-0:1.0: USB hub found Jan 23 18:29:37.147806 kernel: hub 1-0:1.0: 2 ports detected Jan 23 18:29:37.147897 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 23 18:29:37.147980 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 23 18:29:37.147989 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 23 18:29:37.148000 kernel: GPT:25804799 != 104857599 Jan 23 18:29:37.148008 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 23 18:29:37.148015 kernel: GPT:25804799 != 104857599 Jan 23 18:29:37.148022 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 23 18:29:37.148029 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 18:29:37.148036 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 23 18:29:37.148046 kernel: device-mapper: uevent: version 1.0.3 Jan 23 18:29:37.148053 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 23 18:29:37.148060 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 23 18:29:37.148068 kernel: raid6: avx512x4 gen() 37658 MB/s Jan 23 18:29:37.148075 kernel: raid6: avx512x2 gen() 37888 MB/s Jan 23 18:29:37.148083 kernel: raid6: avx512x1 gen() 37684 MB/s Jan 23 18:29:37.148090 kernel: raid6: avx2x4 gen() 30404 MB/s Jan 23 18:29:37.148097 kernel: raid6: avx2x2 gen() 30583 MB/s Jan 23 18:29:37.148105 kernel: raid6: avx2x1 gen() 27549 MB/s Jan 23 18:29:37.148113 kernel: raid6: using algorithm avx512x2 gen() 37888 MB/s Jan 23 18:29:37.148120 kernel: raid6: .... xor() 28606 MB/s, rmw enabled Jan 23 18:29:37.148129 kernel: raid6: using avx512x2 recovery algorithm Jan 23 18:29:37.148136 kernel: xor: automatically using best checksumming function avx Jan 23 18:29:37.148143 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 23 18:29:37.148152 kernel: BTRFS: device fsid ae5f9861-c401-42b4-99c9-2e3fe0b343c2 devid 1 transid 34 /dev/mapper/usr (253:0) scanned by mount (205) Jan 23 18:29:37.148160 kernel: BTRFS info (device dm-0): first mount of filesystem ae5f9861-c401-42b4-99c9-2e3fe0b343c2 Jan 23 18:29:37.148168 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:29:37.148272 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 23 18:29:37.148286 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 23 18:29:37.148294 kernel: BTRFS info (device dm-0): enabling free space tree Jan 23 18:29:37.148304 kernel: loop: module loaded Jan 23 18:29:37.148311 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 23 18:29:37.148318 kernel: loop0: detected capacity change from 0 to 100560 Jan 23 18:29:37.148326 kernel: usbcore: registered new interface driver usbhid Jan 23 18:29:37.148333 kernel: usbhid: USB 
HID core driver Jan 23 18:29:37.148341 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 18:29:37.148348 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 23 18:29:37.148458 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 23 18:29:37.148469 systemd[1]: Successfully made /usr/ read-only. Jan 23 18:29:37.148480 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 18:29:37.148489 systemd[1]: Detected virtualization kvm. Jan 23 18:29:37.148497 systemd[1]: Detected architecture x86-64. Jan 23 18:29:37.148506 systemd[1]: Running in initrd. Jan 23 18:29:37.148513 systemd[1]: No hostname configured, using default hostname. Jan 23 18:29:37.148522 systemd[1]: Hostname set to . Jan 23 18:29:37.148529 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 18:29:37.148537 systemd[1]: Queued start job for default target initrd.target. Jan 23 18:29:37.148544 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 18:29:37.148552 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 18:29:37.148561 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 18:29:37.148569 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 23 18:29:37.148577 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Jan 23 18:29:37.148585 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 23 18:29:37.148593 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 23 18:29:37.148623 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 18:29:37.148631 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 18:29:37.148639 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 23 18:29:37.148646 systemd[1]: Reached target paths.target - Path Units. Jan 23 18:29:37.148654 systemd[1]: Reached target slices.target - Slice Units. Jan 23 18:29:37.148661 systemd[1]: Reached target swap.target - Swaps. Jan 23 18:29:37.148669 systemd[1]: Reached target timers.target - Timer Units. Jan 23 18:29:37.148678 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 18:29:37.148686 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 18:29:37.148693 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 18:29:37.148701 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 23 18:29:37.148709 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 23 18:29:37.148716 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 18:29:37.148724 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 18:29:37.148733 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 18:29:37.148741 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 18:29:37.148749 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 23 18:29:37.148756 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Jan 23 18:29:37.148764 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 18:29:37.148771 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 23 18:29:37.148779 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 23 18:29:37.148789 systemd[1]: Starting systemd-fsck-usr.service... Jan 23 18:29:37.148797 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 18:29:37.148805 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 18:29:37.148813 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:37.148823 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 23 18:29:37.148830 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 18:29:37.148838 systemd[1]: Finished systemd-fsck-usr.service. Jan 23 18:29:37.148865 systemd-journald[346]: Collecting audit messages is enabled. Jan 23 18:29:37.148887 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 18:29:37.148895 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 18:29:37.148904 kernel: audit: type=1130 audit(1769192977.080:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.148912 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:37.148920 kernel: audit: type=1130 audit(1769192977.089:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:37.148930 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 23 18:29:37.148938 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 23 18:29:37.148946 kernel: Bridge firewalling registered Jan 23 18:29:37.148954 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 18:29:37.148963 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 18:29:37.148971 kernel: audit: type=1130 audit(1769192977.117:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.148981 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 18:29:37.148989 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 18:29:37.148997 kernel: audit: type=1130 audit(1769192977.136:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.149005 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 18:29:37.149013 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 18:29:37.149023 systemd-journald[346]: Journal started Jan 23 18:29:37.149043 systemd-journald[346]: Runtime Journal (/run/log/journal/5a102a0d3245405fa2afa4cae2c0142c) is 8M, max 77.9M, 69.9M free. Jan 23 18:29:37.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:37.089000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.102342 systemd-modules-load[347]: Inserted module 'br_netfilter' Jan 23 18:29:37.153050 kernel: audit: type=1130 audit(1769192977.147:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.147000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.152000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.154619 kernel: audit: type=1130 audit(1769192977.152:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.154640 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 18:29:37.161636 kernel: audit: type=1130 audit(1769192977.157:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 23 18:29:37.157000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.162071 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 23 18:29:37.162000 audit: BPF prog-id=6 op=LOAD Jan 23 18:29:37.163437 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 18:29:37.166614 kernel: audit: type=1334 audit(1769192977.162:9): prog-id=6 op=LOAD Jan 23 18:29:37.166728 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 18:29:37.182939 systemd-tmpfiles[381]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 23 18:29:37.187839 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 18:29:37.193782 kernel: audit: type=1130 audit(1769192977.187:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:37.195681 dracut-cmdline[379]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:29:37.217826 systemd-resolved[380]: Positive Trust Anchors: Jan 23 18:29:37.217839 systemd-resolved[380]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 18:29:37.217842 systemd-resolved[380]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 18:29:37.217872 systemd-resolved[380]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 18:29:37.241364 systemd-resolved[380]: Defaulting to hostname 'linux'. Jan 23 18:29:37.242198 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 18:29:37.242000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.242858 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 18:29:37.293731 kernel: Loading iSCSI transport class v2.0-870. 
Jan 23 18:29:37.307645 kernel: iscsi: registered transport (tcp) Jan 23 18:29:37.330068 kernel: iscsi: registered transport (qla4xxx) Jan 23 18:29:37.330130 kernel: QLogic iSCSI HBA Driver Jan 23 18:29:37.353186 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 18:29:37.370368 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 18:29:37.371000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.373382 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 18:29:37.409135 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 23 18:29:37.409000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.411381 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 23 18:29:37.412821 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 23 18:29:37.443965 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 23 18:29:37.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.444000 audit: BPF prog-id=7 op=LOAD Jan 23 18:29:37.444000 audit: BPF prog-id=8 op=LOAD Jan 23 18:29:37.447737 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 18:29:37.470727 systemd-udevd[622]: Using default interface naming scheme 'v257'. 
Jan 23 18:29:37.479888 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 18:29:37.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.484138 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 23 18:29:37.502541 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 18:29:37.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.504000 audit: BPF prog-id=9 op=LOAD Jan 23 18:29:37.505110 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 18:29:37.511382 dracut-pre-trigger[709]: rd.md=0: removing MD RAID activation Jan 23 18:29:37.534018 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 18:29:37.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.536741 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 18:29:37.549280 systemd-networkd[732]: lo: Link UP Jan 23 18:29:37.549904 systemd-networkd[732]: lo: Gained carrier Jan 23 18:29:37.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.550284 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 18:29:37.552715 systemd[1]: Reached target network.target - Network. 
Jan 23 18:29:37.619764 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 18:29:37.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.624313 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 23 18:29:37.722972 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 23 18:29:37.739508 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 23 18:29:37.748335 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 23 18:29:37.757416 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 18:29:37.760176 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 23 18:29:37.775312 kernel: cryptd: max_cpu_qlen set to 1000 Jan 23 18:29:37.794620 disk-uuid[805]: Primary Header is updated. Jan 23 18:29:37.794620 disk-uuid[805]: Secondary Entries is updated. Jan 23 18:29:37.794620 disk-uuid[805]: Secondary Header is updated. Jan 23 18:29:37.795989 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:29:37.800000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.796086 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:37.800679 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:37.801764 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:37.815148 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 23 18:29:37.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.815000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.815231 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:37.816740 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:37.826972 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 23 18:29:37.827398 systemd-networkd[732]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:29:37.827406 systemd-networkd[732]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:29:37.830112 systemd-networkd[732]: eth0: Link UP Jan 23 18:29:37.831404 systemd-networkd[732]: eth0: Gained carrier Jan 23 18:29:37.831417 systemd-networkd[732]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:29:37.845053 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:37.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.853641 kernel: AES CTR mode by8 optimization enabled Jan 23 18:29:37.874393 systemd-networkd[732]: eth0: DHCPv4 address 10.0.9.101/25, gateway 10.0.9.1 acquired from 10.0.9.1 Jan 23 18:29:37.945662 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Jan 23 18:29:37.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:37.946597 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 18:29:37.947094 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 18:29:37.947853 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 18:29:37.949420 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 23 18:29:37.964834 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 23 18:29:37.964000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.864412 disk-uuid[806]: Warning: The kernel is still using the old partition table. Jan 23 18:29:38.864412 disk-uuid[806]: The new table will be used at the next reboot or after you Jan 23 18:29:38.864412 disk-uuid[806]: run partprobe(8) or kpartx(8) Jan 23 18:29:38.864412 disk-uuid[806]: The operation has completed successfully. Jan 23 18:29:38.869988 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 23 18:29:38.877438 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 23 18:29:38.877467 kernel: audit: type=1130 audit(1769192978.869:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.877480 kernel: audit: type=1131 audit(1769192978.870:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:38.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.870099 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 23 18:29:38.872726 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 23 18:29:38.907769 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (921) Jan 23 18:29:38.910027 kernel: BTRFS info (device vda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:29:38.910062 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:29:38.915275 kernel: BTRFS info (device vda6): turning on async discard Jan 23 18:29:38.915323 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 18:29:38.921624 kernel: BTRFS info (device vda6): last unmount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:29:38.921721 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 23 18:29:38.925747 kernel: audit: type=1130 audit(1769192978.921:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.921000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:38.923015 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Jan 23 18:29:39.103643 ignition[940]: Ignition 2.24.0 Jan 23 18:29:39.103653 ignition[940]: Stage: fetch-offline Jan 23 18:29:39.103698 ignition[940]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:29:39.103719 ignition[940]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 18:29:39.104334 ignition[940]: parsed url from cmdline: "" Jan 23 18:29:39.107324 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 18:29:39.104337 ignition[940]: no config URL provided Jan 23 18:29:39.104397 ignition[940]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 18:29:39.104409 ignition[940]: no config at "/usr/lib/ignition/user.ign" Jan 23 18:29:39.104413 ignition[940]: failed to fetch config: resource requires networking Jan 23 18:29:39.105260 ignition[940]: Ignition finished successfully Jan 23 18:29:39.114115 kernel: audit: type=1130 audit(1769192979.109:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:39.109000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:39.110721 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 23 18:29:39.126464 ignition[947]: Ignition 2.24.0 Jan 23 18:29:39.126473 ignition[947]: Stage: fetch Jan 23 18:29:39.126596 ignition[947]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:29:39.126614 ignition[947]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 18:29:39.126680 ignition[947]: parsed url from cmdline: "" Jan 23 18:29:39.126683 ignition[947]: no config URL provided Jan 23 18:29:39.126687 ignition[947]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 18:29:39.126693 ignition[947]: no config at "/usr/lib/ignition/user.ign" Jan 23 18:29:39.126758 ignition[947]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 23 18:29:39.127270 ignition[947]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 18:29:39.127289 ignition[947]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 18:29:39.246093 systemd-networkd[732]: eth0: Gained IPv6LL Jan 23 18:29:40.127460 ignition[947]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 18:29:40.127491 ignition[947]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 18:29:41.127656 ignition[947]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 18:29:41.127713 ignition[947]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 18:29:42.127849 ignition[947]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 18:29:42.127964 ignition[947]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 23 18:29:42.668427 ignition[947]: GET result: OK Jan 23 18:29:42.668514 ignition[947]: parsing config with SHA512: b45ecea9d7855e85790039b3e0697b18456fe992b667cbec0139bda76457807b47f1a9ba2496e8fa5bce7d95a46d48194a645e3bf45db14d7db612b7a856a612 Jan 23 18:29:42.673000 unknown[947]: fetched base config from "system" Jan 23 18:29:42.673267 ignition[947]: fetch: fetch complete Jan 23 18:29:42.673009 unknown[947]: fetched base config from "system" Jan 23 18:29:42.673271 ignition[947]: fetch: fetch passed Jan 23 18:29:42.673014 unknown[947]: fetched user config from "openstack" Jan 23 18:29:42.673307 ignition[947]: Ignition finished successfully Jan 23 18:29:42.674921 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 23 18:29:42.679636 kernel: audit: type=1130 audit(1769192982.674:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.677752 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 23 18:29:42.710840 ignition[953]: Ignition 2.24.0 Jan 23 18:29:42.710850 ignition[953]: Stage: kargs Jan 23 18:29:42.710989 ignition[953]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:29:42.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.713110 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 23 18:29:42.716650 kernel: audit: type=1130 audit(1769192982.712:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:42.710997 ignition[953]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 18:29:42.715712 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 23 18:29:42.711732 ignition[953]: kargs: kargs passed Jan 23 18:29:42.711765 ignition[953]: Ignition finished successfully Jan 23 18:29:42.737124 ignition[959]: Ignition 2.24.0 Jan 23 18:29:42.737133 ignition[959]: Stage: disks Jan 23 18:29:42.737275 ignition[959]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:29:42.738810 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 23 18:29:42.737283 ignition[959]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 18:29:42.742613 kernel: audit: type=1130 audit(1769192982.738:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.738025 ignition[959]: disks: disks passed Jan 23 18:29:42.739796 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 23 18:29:42.738055 ignition[959]: Ignition finished successfully Jan 23 18:29:42.742913 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 23 18:29:42.743449 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 18:29:42.743998 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 18:29:42.744532 systemd[1]: Reached target basic.target - Basic System. Jan 23 18:29:42.746701 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 23 18:29:42.783640 systemd-fsck[967]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 23 18:29:42.785537 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 23 18:29:42.792209 kernel: audit: type=1130 audit(1769192982.785:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:42.788759 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 23 18:29:42.910626 kernel: EXT4-fs (vda9): mounted filesystem eebf2bdd-2461-4b18-9f37-721daf86511d r/w with ordered data mode. Quota mode: none. Jan 23 18:29:42.911534 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 23 18:29:42.912487 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 23 18:29:42.915194 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 18:29:42.917838 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 23 18:29:42.918409 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 23 18:29:42.918935 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 23 18:29:42.920004 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 23 18:29:42.920037 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 18:29:42.929417 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 23 18:29:42.931068 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 23 18:29:42.946631 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (976)
Jan 23 18:29:42.950399 kernel: BTRFS info (device vda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660
Jan 23 18:29:42.950438 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Jan 23 18:29:42.965248 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 18:29:42.965293 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 18:29:42.968904 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 18:29:43.025625 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:43.137084 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 23 18:29:43.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:43.142231 kernel: audit: type=1130 audit(1769192983.137:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:43.142350 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 23 18:29:43.143848 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 23 18:29:43.155820 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 23 18:29:43.157946 kernel: BTRFS info (device vda6): last unmount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660
Jan 23 18:29:43.178482 ignition[1076]: INFO : Ignition 2.24.0
Jan 23 18:29:43.180337 ignition[1076]: INFO : Stage: mount
Jan 23 18:29:43.180000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:43.179796 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 23 18:29:43.184224 ignition[1076]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 18:29:43.184224 ignition[1076]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 18:29:43.184224 ignition[1076]: INFO : mount: mount passed
Jan 23 18:29:43.184224 ignition[1076]: INFO : Ignition finished successfully
Jan 23 18:29:43.186156 kernel: audit: type=1130 audit(1769192983.180:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:43.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:43.185114 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 23 18:29:44.060650 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:46.069624 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:50.075652 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:50.081789 coreos-metadata[978]: Jan 23 18:29:50.081 WARN failed to locate config-drive, using the metadata service API instead
Jan 23 18:29:50.100210 coreos-metadata[978]: Jan 23 18:29:50.100 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Jan 23 18:29:50.695011 coreos-metadata[978]: Jan 23 18:29:50.694 INFO Fetch successful
Jan 23 18:29:50.695011 coreos-metadata[978]: Jan 23 18:29:50.694 INFO wrote hostname ci-4547-1-0-2-32611d5cc2 to /sysroot/etc/hostname
Jan 23 18:29:50.696894 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Jan 23 18:29:50.706152 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 18:29:50.706180 kernel: audit: type=1130 audit(1769192990.696:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:50.706194 kernel: audit: type=1131 audit(1769192990.698:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:50.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:50.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:50.696995 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Jan 23 18:29:50.701745 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 23 18:29:50.716054 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 23 18:29:50.746634 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1094)
Jan 23 18:29:50.750271 kernel: BTRFS info (device vda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660
Jan 23 18:29:50.750325 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Jan 23 18:29:50.756118 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 18:29:50.756157 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 18:29:50.757734 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 18:29:50.783293 ignition[1112]: INFO : Ignition 2.24.0
Jan 23 18:29:50.783293 ignition[1112]: INFO : Stage: files
Jan 23 18:29:50.784728 ignition[1112]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 18:29:50.784728 ignition[1112]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 18:29:50.784728 ignition[1112]: DEBUG : files: compiled without relabeling support, skipping
Jan 23 18:29:50.786020 ignition[1112]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 23 18:29:50.786020 ignition[1112]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 23 18:29:50.789253 ignition[1112]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 23 18:29:50.789880 ignition[1112]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 23 18:29:50.790408 unknown[1112]: wrote ssh authorized keys file for user: core
Jan 23 18:29:50.791027 ignition[1112]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 23 18:29:50.809460 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Jan 23 18:29:50.810587 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Jan 23 18:29:50.852574 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 23 18:29:50.963877 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Jan 23 18:29:50.963877 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 18:29:50.966965 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jan 23 18:29:50.970855 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jan 23 18:29:50.970855 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jan 23 18:29:50.970855 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Jan 23 18:29:51.216496 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 23 18:29:52.057838 ignition[1112]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jan 23 18:29:52.057838 ignition[1112]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jan 23 18:29:52.059575 ignition[1112]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 18:29:52.061598 ignition[1112]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 18:29:52.061598 ignition[1112]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 23 18:29:52.063960 ignition[1112]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Jan 23 18:29:52.063960 ignition[1112]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 23 18:29:52.063960 ignition[1112]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 23 18:29:52.063960 ignition[1112]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 23 18:29:52.063960 ignition[1112]: INFO : files: files passed
Jan 23 18:29:52.063960 ignition[1112]: INFO : Ignition finished successfully
Jan 23 18:29:52.074503 kernel: audit: type=1130 audit(1769192992.063:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.063000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.063504 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 23 18:29:52.066775 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 23 18:29:52.071129 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 23 18:29:52.079820 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 23 18:29:52.079905 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 23 18:29:52.088707 kernel: audit: type=1130 audit(1769192992.080:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.088738 kernel: audit: type=1131 audit(1769192992.080:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.080000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.088810 initrd-setup-root-after-ignition[1144]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 18:29:52.088810 initrd-setup-root-after-ignition[1144]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 18:29:52.090919 initrd-setup-root-after-ignition[1148]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 18:29:52.092563 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 23 18:29:52.096842 kernel: audit: type=1130 audit(1769192992.092:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.092000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.093372 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 23 18:29:52.098050 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 23 18:29:52.136430 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 18:29:52.136536 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 23 18:29:52.144369 kernel: audit: type=1130 audit(1769192992.137:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.144394 kernel: audit: type=1131 audit(1769192992.137:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.137000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.137909 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 23 18:29:52.144780 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 23 18:29:52.145879 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 23 18:29:52.146565 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 23 18:29:52.167582 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 23 18:29:52.171624 kernel: audit: type=1130 audit(1769192992.168:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.173740 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 23 18:29:52.185035 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Jan 23 18:29:52.185198 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 23 18:29:52.185778 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 23 18:29:52.186732 systemd[1]: Stopped target timers.target - Timer Units.
Jan 23 18:29:52.191676 kernel: audit: type=1131 audit(1769192992.187:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.187000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.187498 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 18:29:52.187590 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 23 18:29:52.191767 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 23 18:29:52.192651 systemd[1]: Stopped target basic.target - Basic System.
Jan 23 18:29:52.193622 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 23 18:29:52.194529 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 23 18:29:52.195276 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 23 18:29:52.196054 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jan 23 18:29:52.196867 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 23 18:29:52.197741 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 23 18:29:52.198570 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 23 18:29:52.199407 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 23 18:29:52.200225 systemd[1]: Stopped target swap.target - Swaps.
Jan 23 18:29:52.201000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.201127 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 18:29:52.201228 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 23 18:29:52.202490 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 23 18:29:52.202932 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 23 18:29:52.204000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.203593 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 23 18:29:52.203664 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 23 18:29:52.205000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.204326 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 18:29:52.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.204409 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 23 18:29:52.205531 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 23 18:29:52.205625 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 23 18:29:52.206336 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 23 18:29:52.206409 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 23 18:29:52.208738 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 23 18:29:52.209000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.209317 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 18:29:52.209429 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 23 18:29:52.211779 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 23 18:29:52.212144 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 18:29:52.212000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.212233 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 23 18:29:52.214000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.213642 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 18:29:52.213717 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 23 18:29:52.214410 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 18:29:52.214484 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 23 18:29:52.220302 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 18:29:52.220370 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 23 18:29:52.220000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.233189 ignition[1168]: INFO : Ignition 2.24.0
Jan 23 18:29:52.234791 ignition[1168]: INFO : Stage: umount
Jan 23 18:29:52.234791 ignition[1168]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 18:29:52.234791 ignition[1168]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 18:29:52.234566 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 23 18:29:52.237620 ignition[1168]: INFO : umount: umount passed
Jan 23 18:29:52.237620 ignition[1168]: INFO : Ignition finished successfully
Jan 23 18:29:52.238908 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 23 18:29:52.239522 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 23 18:29:52.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.240491 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 23 18:29:52.240554 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 23 18:29:52.241000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.242310 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 23 18:29:52.242363 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 23 18:29:52.242000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.243132 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 23 18:29:52.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.243167 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 23 18:29:52.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.243711 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 23 18:29:52.243744 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 23 18:29:52.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.244386 systemd[1]: Stopped target network.target - Network.
Jan 23 18:29:52.244988 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 23 18:29:52.245032 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 23 18:29:52.245661 systemd[1]: Stopped target paths.target - Path Units.
Jan 23 18:29:52.246295 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 18:29:52.249647 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 18:29:52.250000 systemd[1]: Stopped target slices.target - Slice Units.
Jan 23 18:29:52.250593 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 23 18:29:52.251214 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 23 18:29:52.251243 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 23 18:29:52.251787 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 23 18:29:52.251811 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 23 18:29:52.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.252338 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Jan 23 18:29:52.253000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.252356 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Jan 23 18:29:52.254000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.252910 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 23 18:29:52.252947 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 23 18:29:52.253517 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 23 18:29:52.253548 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 23 18:29:52.254092 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 23 18:29:52.254122 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 23 18:29:52.254737 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 23 18:29:52.255419 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 23 18:29:52.262176 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 23 18:29:52.262264 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 23 18:29:52.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.265027 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 23 18:29:52.265109 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 23 18:29:52.265000 audit: BPF prog-id=6 op=UNLOAD
Jan 23 18:29:52.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.267000 audit: BPF prog-id=9 op=UNLOAD
Jan 23 18:29:52.267751 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jan 23 18:29:52.268163 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 23 18:29:52.268199 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 18:29:52.270000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.271000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.271000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.270700 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 23 18:29:52.271004 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 23 18:29:52.271048 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 23 18:29:52.271397 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 18:29:52.271427 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 23 18:29:52.271774 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 18:29:52.271802 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 23 18:29:52.273661 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 23 18:29:52.277487 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 18:29:52.277613 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 23 18:29:52.282000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.283843 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 18:29:52.284285 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 23 18:29:52.285029 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 18:29:52.285058 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 18:29:52.285379 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 18:29:52.285445 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 23 18:29:52.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.287296 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 18:29:52.287000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.287000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.287677 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 23 18:29:52.288081 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 23 18:29:52.288118 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 23 18:29:52.291837 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 23 18:29:52.292184 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jan 23 18:29:52.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.292228 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 18:29:52.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.294691 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 18:29:52.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:29:52.294733 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 23 18:29:52.295341 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:29:52.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:52.295375 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:52.296534 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 23 18:29:52.296661 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 23 18:29:52.305725 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 23 18:29:52.305811 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 23 18:29:52.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:52.305000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:52.306790 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 23 18:29:52.307915 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 23 18:29:52.325790 systemd[1]: Switching root. Jan 23 18:29:52.362656 systemd-journald[346]: Journal stopped Jan 23 18:29:53.414540 systemd-journald[346]: Received SIGTERM from PID 1 (systemd). 
Jan 23 18:29:53.414624 kernel: SELinux: policy capability network_peer_controls=1 Jan 23 18:29:53.414642 kernel: SELinux: policy capability open_perms=1 Jan 23 18:29:53.414653 kernel: SELinux: policy capability extended_socket_class=1 Jan 23 18:29:53.414669 kernel: SELinux: policy capability always_check_network=0 Jan 23 18:29:53.414679 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 23 18:29:53.414693 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 23 18:29:53.414708 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 23 18:29:53.414719 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 23 18:29:53.414732 kernel: SELinux: policy capability userspace_initial_context=0 Jan 23 18:29:53.414746 systemd[1]: Successfully loaded SELinux policy in 62.237ms. Jan 23 18:29:53.414762 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.501ms. Jan 23 18:29:53.414781 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 18:29:53.414794 systemd[1]: Detected virtualization kvm. Jan 23 18:29:53.414806 systemd[1]: Detected architecture x86-64. Jan 23 18:29:53.414817 systemd[1]: Detected first boot. Jan 23 18:29:53.414828 systemd[1]: Hostname set to . Jan 23 18:29:53.414840 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 18:29:53.414854 zram_generator::config[1211]: No configuration found. 
Jan 23 18:29:53.414874 kernel: Guest personality initialized and is inactive Jan 23 18:29:53.414885 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 23 18:29:53.414896 kernel: Initialized host personality Jan 23 18:29:53.414906 kernel: NET: Registered PF_VSOCK protocol family Jan 23 18:29:53.414917 systemd[1]: Populated /etc with preset unit settings. Jan 23 18:29:53.414928 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 23 18:29:53.414939 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 23 18:29:53.414951 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 23 18:29:53.414966 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 23 18:29:53.414977 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 23 18:29:53.414988 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 23 18:29:53.414999 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 23 18:29:53.415010 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 23 18:29:53.415022 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 23 18:29:53.415035 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 23 18:29:53.415046 systemd[1]: Created slice user.slice - User and Session Slice. Jan 23 18:29:53.415057 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 18:29:53.415068 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 18:29:53.415080 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 23 18:29:53.415090 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 23 18:29:53.415101 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 23 18:29:53.415114 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 18:29:53.415127 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 23 18:29:53.415140 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 18:29:53.415151 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 18:29:53.415165 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 23 18:29:53.415178 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 23 18:29:53.415189 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 23 18:29:53.415201 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 23 18:29:53.415212 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 18:29:53.415224 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 18:29:53.415235 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 23 18:29:53.415246 systemd[1]: Reached target slices.target - Slice Units. Jan 23 18:29:53.415260 systemd[1]: Reached target swap.target - Swaps. Jan 23 18:29:53.415271 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 23 18:29:53.415283 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 23 18:29:53.415294 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 23 18:29:53.415305 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 18:29:53.415317 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. 
Jan 23 18:29:53.415328 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 18:29:53.415341 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 23 18:29:53.415352 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 23 18:29:53.415363 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 18:29:53.415374 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 18:29:53.415385 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 23 18:29:53.415396 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 23 18:29:53.415407 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 23 18:29:53.415420 systemd[1]: Mounting media.mount - External Media Directory... Jan 23 18:29:53.415431 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:29:53.415441 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 23 18:29:53.415453 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 23 18:29:53.415464 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 23 18:29:53.415476 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 23 18:29:53.415489 systemd[1]: Reached target machines.target - Containers. Jan 23 18:29:53.415501 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 23 18:29:53.415512 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 18:29:53.415523 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 23 18:29:53.415533 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 23 18:29:53.415544 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 18:29:53.415556 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 18:29:53.415568 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 18:29:53.415579 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 23 18:29:53.415590 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 18:29:53.420576 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 23 18:29:53.420623 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 23 18:29:53.420636 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 23 18:29:53.420647 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 23 18:29:53.420659 systemd[1]: Stopped systemd-fsck-usr.service. Jan 23 18:29:53.420670 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 18:29:53.420681 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 18:29:53.420693 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 18:29:53.420706 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 18:29:53.420717 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 23 18:29:53.420728 kernel: fuse: init (API version 7.41) Jan 23 18:29:53.420739 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 23 18:29:53.420751 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 18:29:53.422505 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:29:53.422518 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 23 18:29:53.422533 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 23 18:29:53.422546 systemd[1]: Mounted media.mount - External Media Directory. Jan 23 18:29:53.422557 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 23 18:29:53.422567 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 23 18:29:53.422578 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 23 18:29:53.422590 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 18:29:53.422614 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 23 18:29:53.422626 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 23 18:29:53.422637 kernel: ACPI: bus type drm_connector registered Jan 23 18:29:53.422649 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 18:29:53.422659 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 18:29:53.422669 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 18:29:53.422681 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 18:29:53.422691 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 18:29:53.422702 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 18:29:53.422712 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 23 18:29:53.422723 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. 
Jan 23 18:29:53.422734 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 18:29:53.422744 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 18:29:53.422756 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 18:29:53.422766 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 18:29:53.422776 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 23 18:29:53.422788 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 18:29:53.422798 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 23 18:29:53.422809 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 23 18:29:53.422819 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 23 18:29:53.422832 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 23 18:29:53.422843 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 18:29:53.422854 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 23 18:29:53.422866 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 18:29:53.422877 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 23 18:29:53.422910 systemd-journald[1286]: Collecting audit messages is enabled. Jan 23 18:29:53.422936 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 23 18:29:53.422947 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jan 23 18:29:53.422958 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 23 18:29:53.422970 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 18:29:53.422981 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 18:29:53.422993 systemd-journald[1286]: Journal started Jan 23 18:29:53.423016 systemd-journald[1286]: Runtime Journal (/run/log/journal/5a102a0d3245405fa2afa4cae2c0142c) is 8M, max 77.9M, 69.9M free. Jan 23 18:29:53.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.269000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.272000 audit: BPF prog-id=14 op=UNLOAD Jan 23 18:29:53.272000 audit: BPF prog-id=13 op=UNLOAD Jan 23 18:29:53.273000 audit: BPF prog-id=15 op=LOAD Jan 23 18:29:53.273000 audit: BPF prog-id=16 op=LOAD Jan 23 18:29:53.273000 audit: BPF prog-id=17 op=LOAD Jan 23 18:29:53.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.352000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 23 18:29:53.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.364000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.369000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:53.374000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.374000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.376000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.406000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 23 18:29:53.406000 audit[1286]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffff2d21db0 a2=4000 a3=0 items=0 ppid=1 pid=1286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:29:53.406000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 23 18:29:53.096524 systemd[1]: Queued start job for default target multi-user.target. Jan 23 18:29:53.121352 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. 
Jan 23 18:29:53.121745 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 23 18:29:53.433363 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 23 18:29:53.433398 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 18:29:53.432000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.433989 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 23 18:29:53.433000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.435503 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 23 18:29:53.436082 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 23 18:29:53.449715 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 23 18:29:53.450660 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 23 18:29:53.450000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.452254 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 23 18:29:53.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:53.455818 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 23 18:29:53.460948 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 23 18:29:53.465880 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 23 18:29:53.484416 kernel: loop1: detected capacity change from 0 to 50784 Jan 23 18:29:53.483747 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 18:29:53.484000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.491407 systemd-journald[1286]: Time spent on flushing to /var/log/journal/5a102a0d3245405fa2afa4cae2c0142c is 26.926ms for 1855 entries. Jan 23 18:29:53.491407 systemd-journald[1286]: System Journal (/var/log/journal/5a102a0d3245405fa2afa4cae2c0142c) is 8M, max 588.1M, 580.1M free. Jan 23 18:29:53.528931 systemd-journald[1286]: Received client request to flush runtime journal. Jan 23 18:29:53.493000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.493227 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 18:29:53.497484 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Jan 23 18:29:53.532549 kernel: loop2: detected capacity change from 0 to 1656 Jan 23 18:29:53.531000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.531221 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 23 18:29:53.540196 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 23 18:29:53.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.541000 audit: BPF prog-id=18 op=LOAD Jan 23 18:29:53.542000 audit: BPF prog-id=19 op=LOAD Jan 23 18:29:53.542000 audit: BPF prog-id=20 op=LOAD Jan 23 18:29:53.543818 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 23 18:29:53.544000 audit: BPF prog-id=21 op=LOAD Jan 23 18:29:53.546730 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 18:29:53.548893 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 18:29:53.554614 kernel: loop3: detected capacity change from 0 to 111560 Jan 23 18:29:53.555000 audit: BPF prog-id=22 op=LOAD Jan 23 18:29:53.555000 audit: BPF prog-id=23 op=LOAD Jan 23 18:29:53.555000 audit: BPF prog-id=24 op=LOAD Jan 23 18:29:53.558780 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 23 18:29:53.564000 audit: BPF prog-id=25 op=LOAD Jan 23 18:29:53.565000 audit: BPF prog-id=26 op=LOAD Jan 23 18:29:53.569000 audit: BPF prog-id=27 op=LOAD Jan 23 18:29:53.571752 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 23 18:29:53.599007 systemd-tmpfiles[1359]: ACLs are not supported, ignoring. 
Jan 23 18:29:53.599020 systemd-tmpfiles[1359]: ACLs are not supported, ignoring. Jan 23 18:29:53.604459 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 18:29:53.605000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.611618 kernel: loop4: detected capacity change from 0 to 224512 Jan 23 18:29:53.623281 systemd-nsresourced[1360]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 23 18:29:53.624262 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 23 18:29:53.624000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.634997 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 23 18:29:53.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.653621 kernel: loop5: detected capacity change from 0 to 50784 Jan 23 18:29:53.683623 kernel: loop6: detected capacity change from 0 to 1656 Jan 23 18:29:53.685739 systemd-oomd[1357]: No swap; memory pressure usage will be degraded Jan 23 18:29:53.686169 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 23 18:29:53.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:29:53.695630 kernel: loop7: detected capacity change from 0 to 111560 Jan 23 18:29:53.713640 kernel: loop1: detected capacity change from 0 to 224512 Jan 23 18:29:53.720673 systemd-resolved[1358]: Positive Trust Anchors: Jan 23 18:29:53.720683 systemd-resolved[1358]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 18:29:53.720687 systemd-resolved[1358]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 18:29:53.720715 systemd-resolved[1358]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 18:29:53.747862 systemd-resolved[1358]: Using system hostname 'ci-4547-1-0-2-32611d5cc2'. Jan 23 18:29:53.749371 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 18:29:53.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:53.750394 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 18:29:53.751125 (sd-merge)[1379]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 23 18:29:53.754838 (sd-merge)[1379]: Merged extensions into '/usr'. Jan 23 18:29:53.761723 systemd[1]: Reload requested from client PID 1317 ('systemd-sysext') (unit systemd-sysext.service)... 
Jan 23 18:29:53.761738 systemd[1]: Reloading... Jan 23 18:29:53.821622 zram_generator::config[1408]: No configuration found. Jan 23 18:29:53.995798 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 23 18:29:53.996196 systemd[1]: Reloading finished in 234 ms. Jan 23 18:29:54.015872 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 23 18:29:54.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.016683 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 23 18:29:54.016000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.024693 systemd[1]: Starting ensure-sysext.service... Jan 23 18:29:54.027711 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 18:29:54.027000 audit: BPF prog-id=8 op=UNLOAD Jan 23 18:29:54.027000 audit: BPF prog-id=7 op=UNLOAD Jan 23 18:29:54.028000 audit: BPF prog-id=28 op=LOAD Jan 23 18:29:54.028000 audit: BPF prog-id=29 op=LOAD Jan 23 18:29:54.029748 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 23 18:29:54.030000 audit: BPF prog-id=30 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=25 op=UNLOAD Jan 23 18:29:54.030000 audit: BPF prog-id=31 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=32 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=26 op=UNLOAD Jan 23 18:29:54.030000 audit: BPF prog-id=27 op=UNLOAD Jan 23 18:29:54.030000 audit: BPF prog-id=33 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=15 op=UNLOAD Jan 23 18:29:54.030000 audit: BPF prog-id=34 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=35 op=LOAD Jan 23 18:29:54.030000 audit: BPF prog-id=16 op=UNLOAD Jan 23 18:29:54.030000 audit: BPF prog-id=17 op=UNLOAD Jan 23 18:29:54.032000 audit: BPF prog-id=36 op=LOAD Jan 23 18:29:54.032000 audit: BPF prog-id=18 op=UNLOAD Jan 23 18:29:54.032000 audit: BPF prog-id=37 op=LOAD Jan 23 18:29:54.032000 audit: BPF prog-id=38 op=LOAD Jan 23 18:29:54.032000 audit: BPF prog-id=19 op=UNLOAD Jan 23 18:29:54.032000 audit: BPF prog-id=20 op=UNLOAD Jan 23 18:29:54.033000 audit: BPF prog-id=39 op=LOAD Jan 23 18:29:54.033000 audit: BPF prog-id=21 op=UNLOAD Jan 23 18:29:54.034000 audit: BPF prog-id=40 op=LOAD Jan 23 18:29:54.034000 audit: BPF prog-id=22 op=UNLOAD Jan 23 18:29:54.034000 audit: BPF prog-id=41 op=LOAD Jan 23 18:29:54.035000 audit: BPF prog-id=42 op=LOAD Jan 23 18:29:54.035000 audit: BPF prog-id=23 op=UNLOAD Jan 23 18:29:54.035000 audit: BPF prog-id=24 op=UNLOAD Jan 23 18:29:54.047714 systemd[1]: Reload requested from client PID 1454 ('systemctl') (unit ensure-sysext.service)... Jan 23 18:29:54.047729 systemd[1]: Reloading... Jan 23 18:29:54.063235 systemd-udevd[1456]: Using default interface naming scheme 'v257'. Jan 23 18:29:54.066324 systemd-tmpfiles[1455]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 23 18:29:54.066558 systemd-tmpfiles[1455]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
Jan 23 18:29:54.066825 systemd-tmpfiles[1455]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 23 18:29:54.068234 systemd-tmpfiles[1455]: ACLs are not supported, ignoring. Jan 23 18:29:54.068360 systemd-tmpfiles[1455]: ACLs are not supported, ignoring. Jan 23 18:29:54.077767 systemd-tmpfiles[1455]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 18:29:54.077976 systemd-tmpfiles[1455]: Skipping /boot Jan 23 18:29:54.087902 systemd-tmpfiles[1455]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 18:29:54.088004 systemd-tmpfiles[1455]: Skipping /boot Jan 23 18:29:54.124622 zram_generator::config[1506]: No configuration found. Jan 23 18:29:54.236656 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Jan 23 18:29:54.241619 kernel: mousedev: PS/2 mouse device common for all mice Jan 23 18:29:54.246616 kernel: ACPI: button: Power Button [PWRF] Jan 23 18:29:54.334197 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 23 18:29:54.334446 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 23 18:29:54.334563 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 23 18:29:54.400697 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 18:29:54.401308 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 23 18:29:54.402681 systemd[1]: Reloading finished in 354 ms. Jan 23 18:29:54.409533 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 18:29:54.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.412992 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 23 18:29:54.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.416000 audit: BPF prog-id=43 op=LOAD Jan 23 18:29:54.416000 audit: BPF prog-id=44 op=LOAD Jan 23 18:29:54.416000 audit: BPF prog-id=28 op=UNLOAD Jan 23 18:29:54.416000 audit: BPF prog-id=29 op=UNLOAD Jan 23 18:29:54.416000 audit: BPF prog-id=45 op=LOAD Jan 23 18:29:54.416000 audit: BPF prog-id=36 op=UNLOAD Jan 23 18:29:54.416000 audit: BPF prog-id=46 op=LOAD Jan 23 18:29:54.416000 audit: BPF prog-id=47 op=LOAD Jan 23 18:29:54.416000 audit: BPF prog-id=37 op=UNLOAD Jan 23 18:29:54.416000 audit: BPF prog-id=38 op=UNLOAD Jan 23 18:29:54.418000 audit: BPF prog-id=48 op=LOAD Jan 23 18:29:54.418000 audit: BPF prog-id=39 op=UNLOAD Jan 23 18:29:54.420000 audit: BPF prog-id=49 op=LOAD Jan 23 18:29:54.420000 audit: BPF prog-id=33 op=UNLOAD Jan 23 18:29:54.420000 audit: BPF prog-id=50 op=LOAD Jan 23 18:29:54.420000 audit: BPF prog-id=51 op=LOAD Jan 23 18:29:54.420000 audit: BPF prog-id=34 op=UNLOAD Jan 23 18:29:54.420000 audit: BPF prog-id=35 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=52 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=30 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=53 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=54 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=31 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=32 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=55 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=40 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=56 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=57 op=LOAD Jan 23 18:29:54.421000 audit: BPF prog-id=41 op=UNLOAD Jan 23 18:29:54.421000 audit: BPF prog-id=42 op=UNLOAD Jan 23 18:29:54.483948 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check 
(ConditionVirtualization=xen). Jan 23 18:29:54.486095 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 18:29:54.488835 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 23 18:29:54.489412 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 18:29:54.493450 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 18:29:54.495815 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 18:29:54.496910 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 18:29:54.497819 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 18:29:54.498006 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 23 18:29:54.501472 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 23 18:29:54.505827 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 23 18:29:54.506725 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 18:29:54.509591 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 23 18:29:54.510000 audit: BPF prog-id=58 op=LOAD Jan 23 18:29:54.515284 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 18:29:54.519830 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Jan 23 18:29:54.524618 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 23 18:29:54.525914 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:54.526679 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:29:54.527614 kernel: Console: switching to colour dummy device 80x25 Jan 23 18:29:54.530612 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 23 18:29:54.534697 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 23 18:29:54.534730 kernel: [drm] features: -context_init Jan 23 18:29:54.535136 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:29:54.535340 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 18:29:54.550617 kernel: [drm] number of scanouts: 1 Jan 23 18:29:54.556650 kernel: [drm] number of cap sets: 0 Jan 23 18:29:54.559880 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 23 18:29:54.558007 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 18:29:54.577238 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 23 18:29:54.577769 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 18:29:54.577940 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 23 18:29:54.578027 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Jan 23 18:29:54.578173 systemd[1]: Reached target time-set.target - System Time Set. Jan 23 18:29:54.580116 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:29:54.584671 systemd[1]: Finished ensure-sysext.service. Jan 23 18:29:54.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.594269 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 23 18:29:54.593000 audit[1584]: SYSTEM_BOOT pid=1584 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.594000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.598484 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 18:29:54.602384 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 18:29:54.601000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.603458 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 23 18:29:54.608391 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 18:29:54.607000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.607000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.608000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.608000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.608803 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 18:29:54.609522 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 18:29:54.609887 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 18:29:54.610029 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 18:29:54.616967 kernel: pps_core: LinuxPPS API ver. 
1 registered Jan 23 18:29:54.617011 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 23 18:29:54.622129 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 18:29:54.622310 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 18:29:54.624000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.624741 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 23 18:29:54.626635 kernel: PTP clock support registered Jan 23 18:29:54.633811 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 23 18:29:54.634000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.634000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.634669 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 23 18:29:54.644647 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 23 18:29:54.646093 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 18:29:54.668292 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Jan 23 18:29:54.671787 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 23 18:29:54.675000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.694892 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:29:54.695091 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:54.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:29:54.703773 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:29:54.704000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 23 18:29:54.704000 audit[1623]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc23ea4840 a2=420 a3=0 items=0 ppid=1575 pid=1623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:29:54.704000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:29:54.706467 augenrules[1623]: No rules Jan 23 18:29:54.708007 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 18:29:54.708213 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 23 18:29:54.719474 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 23 18:29:54.719981 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 23 18:29:54.734378 systemd-networkd[1583]: lo: Link UP Jan 23 18:29:54.734385 systemd-networkd[1583]: lo: Gained carrier Jan 23 18:29:54.735447 systemd-networkd[1583]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:29:54.735453 systemd-networkd[1583]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:29:54.735752 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 18:29:54.735894 systemd[1]: Reached target network.target - Network. Jan 23 18:29:54.737535 systemd-networkd[1583]: eth0: Link UP Jan 23 18:29:54.737681 systemd-networkd[1583]: eth0: Gained carrier Jan 23 18:29:54.737693 systemd-networkd[1583]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:29:54.738490 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 23 18:29:54.744048 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 23 18:29:54.756676 systemd-networkd[1583]: eth0: DHCPv4 address 10.0.9.101/25, gateway 10.0.9.1 acquired from 10.0.9.1 Jan 23 18:29:54.766587 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:29:54.775649 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Jan 23 18:29:55.155266 ldconfig[1580]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 23 18:29:55.159474 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 23 18:29:55.162457 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 23 18:29:55.186133 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 23 18:29:55.188223 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 18:29:55.188383 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 23 18:29:55.188461 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 23 18:29:55.188524 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 23 18:29:55.188861 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 23 18:29:55.189424 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 23 18:29:55.189512 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 23 18:29:55.189620 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 23 18:29:55.189672 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 23 18:29:55.190120 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 23 18:29:55.190142 systemd[1]: Reached target paths.target - Path Units. Jan 23 18:29:55.190534 systemd[1]: Reached target timers.target - Timer Units. Jan 23 18:29:55.194648 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 23 18:29:55.196157 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Jan 23 18:29:55.199040 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 23 18:29:55.209432 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 23 18:29:55.209956 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 23 18:29:55.221304 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 23 18:29:55.222263 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 23 18:29:55.223455 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 23 18:29:55.227297 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 18:29:55.229487 systemd[1]: Reached target basic.target - Basic System. Jan 23 18:29:55.230008 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 23 18:29:55.230041 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 23 18:29:55.231939 systemd[1]: Starting chronyd.service - NTP client/server... Jan 23 18:29:55.235695 systemd[1]: Starting containerd.service - containerd container runtime... Jan 23 18:29:55.242969 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 23 18:29:55.244717 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 23 18:29:55.249767 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 23 18:29:55.254695 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 23 18:29:55.260724 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 23 18:29:55.261200 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). 
Jan 23 18:29:55.264087 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 23 18:29:55.270616 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 18:29:55.270722 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 23 18:29:55.272587 jq[1650]: false Jan 23 18:29:55.273679 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 23 18:29:55.282173 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 23 18:29:55.286006 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 23 18:29:55.291410 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 23 18:29:55.294323 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 23 18:29:55.298863 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 23 18:29:55.303023 systemd[1]: Starting update-engine.service - Update Engine... Jan 23 18:29:55.307924 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 23 18:29:55.311188 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 23 18:29:55.313145 chronyd[1645]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 23 18:29:55.313420 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 23 18:29:55.313632 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 23 18:29:55.318952 chronyd[1645]: Loaded seccomp filter (level 2) Jan 23 18:29:55.319073 systemd[1]: Started chronyd.service - NTP client/server. 
Jan 23 18:29:55.326629 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Refreshing passwd entry cache Jan 23 18:29:55.326638 oslogin_cache_refresh[1652]: Refreshing passwd entry cache Jan 23 18:29:55.335393 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 23 18:29:55.336089 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 23 18:29:55.337009 extend-filesystems[1651]: Found /dev/vda6 Jan 23 18:29:55.352059 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Failure getting users, quitting Jan 23 18:29:55.352056 oslogin_cache_refresh[1652]: Failure getting users, quitting Jan 23 18:29:55.352150 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 23 18:29:55.352150 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Refreshing group entry cache Jan 23 18:29:55.352072 oslogin_cache_refresh[1652]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 23 18:29:55.352105 oslogin_cache_refresh[1652]: Refreshing group entry cache Jan 23 18:29:55.353527 extend-filesystems[1651]: Found /dev/vda9 Jan 23 18:29:55.361927 jq[1663]: true Jan 23 18:29:55.361781 oslogin_cache_refresh[1652]: Failure getting groups, quitting Jan 23 18:29:55.362082 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Failure getting groups, quitting Jan 23 18:29:55.362082 google_oslogin_nss_cache[1652]: oslogin_cache_refresh[1652]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 18:29:55.361792 oslogin_cache_refresh[1652]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 18:29:55.363843 extend-filesystems[1651]: Checking size of /dev/vda9 Jan 23 18:29:55.363966 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 23 18:29:55.364200 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Jan 23 18:29:55.379656 systemd[1]: motdgen.service: Deactivated successfully. Jan 23 18:29:55.380484 tar[1666]: linux-amd64/LICENSE Jan 23 18:29:55.380484 tar[1666]: linux-amd64/helm Jan 23 18:29:55.380814 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 23 18:29:55.384398 update_engine[1661]: I20260123 18:29:55.384323 1661 main.cc:92] Flatcar Update Engine starting Jan 23 18:29:55.390092 dbus-daemon[1648]: [system] SELinux support is enabled Jan 23 18:29:55.390955 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 23 18:29:55.395740 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 23 18:29:55.395766 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 23 18:29:55.397322 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 23 18:29:55.397341 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 23 18:29:55.407563 jq[1693]: true Jan 23 18:29:55.414512 extend-filesystems[1651]: Resized partition /dev/vda9 Jan 23 18:29:55.415704 systemd[1]: Started update-engine.service - Update Engine. Jan 23 18:29:55.416280 update_engine[1661]: I20260123 18:29:55.416055 1661 update_check_scheduler.cc:74] Next update check in 4m34s Jan 23 18:29:55.424772 extend-filesystems[1704]: resize2fs 1.47.3 (8-Jul-2025) Jan 23 18:29:55.428909 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 23 18:29:55.432418 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 23 18:29:55.487529 systemd-logind[1659]: New seat seat0. 
Jan 23 18:29:55.495956 systemd-logind[1659]: Watching system buttons on /dev/input/event3 (Power Button) Jan 23 18:29:55.495977 systemd-logind[1659]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 23 18:29:55.497058 systemd[1]: Started systemd-logind.service - User Login Management. Jan 23 18:29:55.546049 bash[1720]: Updated "/home/core/.ssh/authorized_keys" Jan 23 18:29:55.545475 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 23 18:29:55.550241 systemd[1]: Starting sshkeys.service... Jan 23 18:29:55.576566 sshd_keygen[1678]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 18:29:55.628391 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 23 18:29:55.633713 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 23 18:29:55.665807 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 18:29:55.672254 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 18:29:55.680094 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 23 18:29:55.703396 locksmithd[1705]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 23 18:29:55.705338 systemd[1]: issuegen.service: Deactivated successfully. Jan 23 18:29:55.706351 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 18:29:55.710840 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Jan 23 18:29:55.717568 containerd[1690]: time="2026-01-23T18:29:55Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Jan 23 18:29:55.718960 containerd[1690]: time="2026-01-23T18:29:55.718775688Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728567159Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.392µs"
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728596631Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728667103Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728680615Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728776317Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728788276Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728826666Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728838271Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728982621Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.728995864Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.729006876Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729033 containerd[1690]: time="2026-01-23T18:29:55.729015428Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729232 containerd[1690]: time="2026-01-23T18:29:55.729118693Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729232 containerd[1690]: time="2026-01-23T18:29:55.729131245Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729232 containerd[1690]: time="2026-01-23T18:29:55.729178390Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729527 containerd[1690]: time="2026-01-23T18:29:55.729302129Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729527 containerd[1690]: time="2026-01-23T18:29:55.729330917Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 23 18:29:55.729527 containerd[1690]: time="2026-01-23T18:29:55.729342265Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Jan 23 18:29:55.729527 containerd[1690]: time="2026-01-23T18:29:55.729360895Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Jan 23 18:29:55.730966 kernel: EXT4-fs (vda9): resized filesystem to 11516923
Jan 23 18:29:55.731003 containerd[1690]: time="2026-01-23T18:29:55.729704559Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Jan 23 18:29:55.731003 containerd[1690]: time="2026-01-23T18:29:55.729759455Z" level=info msg="metadata content store policy set" policy=shared
Jan 23 18:29:55.734245 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 23 18:29:55.739398 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 23 18:29:55.742294 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jan 23 18:29:55.743145 systemd[1]: Reached target getty.target - Login Prompts.
Jan 23 18:29:55.750730 containerd[1690]: time="2026-01-23T18:29:55.750698430Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750804899Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750872303Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750882327Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750899631Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750912193Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750922234Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750938627Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750949067Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750965172Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750974749Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750983236Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.750990380Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Jan 23 18:29:55.751487 containerd[1690]: time="2026-01-23T18:29:55.751002098Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Jan 23 18:29:55.751716 extend-filesystems[1704]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Jan 23 18:29:55.751716 extend-filesystems[1704]: old_desc_blocks = 1, new_desc_blocks = 6
Jan 23 18:29:55.751716 extend-filesystems[1704]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long.
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751097061Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751112849Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751129986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751139584Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751147963Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751155952Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751165071Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751172081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.751180794Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753496417Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753522958Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753624461Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753670963Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753681481Z" level=info msg="Start snapshots syncer"
Jan 23 18:29:55.756329 containerd[1690]: time="2026-01-23T18:29:55.753737986Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Jan 23 18:29:55.752248 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 23 18:29:55.757289 extend-filesystems[1651]: Resized filesystem in /dev/vda9
Jan 23 18:29:55.759505 containerd[1690]: time="2026-01-23T18:29:55.754056699Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Jan 23 18:29:55.759505 containerd[1690]: time="2026-01-23T18:29:55.754098451Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Jan 23 18:29:55.753244 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754144189Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754254244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754288138Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754304856Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754313812Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754323468Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754332110Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754340084Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754348481Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.754356358Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.756690933Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.756715730Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Jan 23 18:29:55.759886 containerd[1690]: time="2026-01-23T18:29:55.756724148Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756791569Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756799304Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756808055Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756816664Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756830539Z" level=info msg="runtime interface created"
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756834813Z" level=info msg="created NRI interface"
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756841945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756852488Z" level=info msg="Connect containerd service"
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.756881470Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 23 18:29:55.760098 containerd[1690]: time="2026-01-23T18:29:55.758891318Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849627069Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849693107Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849714213Z" level=info msg="Start subscribing containerd event"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849733610Z" level=info msg="Start recovering state"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849816005Z" level=info msg="Start event monitor"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849825624Z" level=info msg="Start cni network conf syncer for default"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849833864Z" level=info msg="Start streaming server"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849840210Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849845936Z" level=info msg="runtime interface starting up..."
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849850472Z" level=info msg="starting plugins..."
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849861300Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Jan 23 18:29:55.850636 containerd[1690]: time="2026-01-23T18:29:55.849945935Z" level=info msg="containerd successfully booted in 0.132722s"
Jan 23 18:29:55.850113 systemd[1]: Started containerd.service - containerd container runtime.
Jan 23 18:29:55.885801 systemd-networkd[1583]: eth0: Gained IPv6LL
Jan 23 18:29:55.887618 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jan 23 18:29:55.890172 systemd[1]: Reached target network-online.target - Network is Online.
Jan 23 18:29:55.894841 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 18:29:55.896469 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jan 23 18:29:55.928390 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jan 23 18:29:55.967339 tar[1666]: linux-amd64/README.md
Jan 23 18:29:55.986628 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 23 18:29:56.288699 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:56.689632 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:56.842634 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 18:29:56.854983 (kubelet)[1790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 23 18:29:57.275456 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Jan 23 18:29:57.277830 systemd[1]: Started sshd@0-10.0.9.101:22-68.220.241.50:44042.service - OpenSSH per-connection server daemon (68.220.241.50:44042).
Jan 23 18:29:57.426825 kubelet[1790]: E0123 18:29:57.426763 1790 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 23 18:29:57.429829 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 18:29:57.429962 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 23 18:29:57.431695 systemd[1]: kubelet.service: Consumed 949ms CPU time, 265.4M memory peak.
Jan 23 18:29:57.828999 sshd[1796]: Accepted publickey for core from 68.220.241.50 port 44042 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:29:57.830050 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:29:57.841288 systemd-logind[1659]: New session 1 of user core.
Jan 23 18:29:57.842920 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Jan 23 18:29:57.846685 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Jan 23 18:29:57.877477 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Jan 23 18:29:57.881180 systemd[1]: Starting user@500.service - User Manager for UID 500...
Jan 23 18:29:57.896362 (systemd)[1806]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:29:57.899318 systemd-logind[1659]: New session 2 of user core.
Jan 23 18:29:58.001351 systemd[1806]: Queued start job for default target default.target.
Jan 23 18:29:58.008418 systemd[1806]: Created slice app.slice - User Application Slice.
Jan 23 18:29:58.008546 systemd[1806]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories.
Jan 23 18:29:58.008596 systemd[1806]: Reached target paths.target - Paths.
Jan 23 18:29:58.008691 systemd[1806]: Reached target timers.target - Timers.
Jan 23 18:29:58.009918 systemd[1806]: Starting dbus.socket - D-Bus User Message Bus Socket...
Jan 23 18:29:58.010711 systemd[1806]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories...
Jan 23 18:29:58.023032 systemd[1806]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Jan 23 18:29:58.023212 systemd[1806]: Reached target sockets.target - Sockets.
Jan 23 18:29:58.025315 systemd[1806]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories.
Jan 23 18:29:58.025413 systemd[1806]: Reached target basic.target - Basic System.
Jan 23 18:29:58.025462 systemd[1806]: Reached target default.target - Main User Target.
Jan 23 18:29:58.025489 systemd[1806]: Startup finished in 121ms.
Jan 23 18:29:58.025599 systemd[1]: Started user@500.service - User Manager for UID 500.
Jan 23 18:29:58.038017 systemd[1]: Started session-1.scope - Session 1 of User core.
Jan 23 18:29:58.299648 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:58.339428 systemd[1]: Started sshd@1-10.0.9.101:22-68.220.241.50:44056.service - OpenSSH per-connection server daemon (68.220.241.50:44056).
Jan 23 18:29:58.701633 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:29:58.869641 sshd[1821]: Accepted publickey for core from 68.220.241.50 port 44056 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:29:58.870777 sshd-session[1821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:29:58.874536 systemd-logind[1659]: New session 3 of user core.
Jan 23 18:29:58.881864 systemd[1]: Started session-3.scope - Session 3 of User core.
Jan 23 18:29:59.160510 sshd[1826]: Connection closed by 68.220.241.50 port 44056
Jan 23 18:29:59.159182 sshd-session[1821]: pam_unix(sshd:session): session closed for user core
Jan 23 18:29:59.162773 systemd[1]: sshd@1-10.0.9.101:22-68.220.241.50:44056.service: Deactivated successfully.
Jan 23 18:29:59.164993 systemd-logind[1659]: Session 3 logged out. Waiting for processes to exit.
Jan 23 18:29:59.165819 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 18:29:59.167135 systemd-logind[1659]: Removed session 3.
Jan 23 18:29:59.268471 systemd[1]: Started sshd@2-10.0.9.101:22-68.220.241.50:44062.service - OpenSSH per-connection server daemon (68.220.241.50:44062).
Jan 23 18:29:59.785639 sshd[1832]: Accepted publickey for core from 68.220.241.50 port 44062 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:29:59.786488 sshd-session[1832]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:29:59.790260 systemd-logind[1659]: New session 4 of user core.
Jan 23 18:29:59.798069 systemd[1]: Started session-4.scope - Session 4 of User core.
Jan 23 18:30:00.074690 sshd[1836]: Connection closed by 68.220.241.50 port 44062
Jan 23 18:30:00.075319 sshd-session[1832]: pam_unix(sshd:session): session closed for user core
Jan 23 18:30:00.079549 systemd[1]: sshd@2-10.0.9.101:22-68.220.241.50:44062.service: Deactivated successfully.
Jan 23 18:30:00.081314 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 18:30:00.082532 systemd-logind[1659]: Session 4 logged out. Waiting for processes to exit.
Jan 23 18:30:00.083231 systemd-logind[1659]: Removed session 4.
Jan 23 18:30:02.306653 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:30:02.315816 coreos-metadata[1647]: Jan 23 18:30:02.315 WARN failed to locate config-drive, using the metadata service API instead
Jan 23 18:30:02.331207 coreos-metadata[1647]: Jan 23 18:30:02.331 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Jan 23 18:30:02.713629 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 18:30:02.719378 coreos-metadata[1733]: Jan 23 18:30:02.719 WARN failed to locate config-drive, using the metadata service API instead
Jan 23 18:30:02.731827 coreos-metadata[1733]: Jan 23 18:30:02.731 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Jan 23 18:30:03.625860 coreos-metadata[1647]: Jan 23 18:30:03.625 INFO Fetch successful
Jan 23 18:30:03.625860 coreos-metadata[1647]: Jan 23 18:30:03.625 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Jan 23 18:30:04.200589 coreos-metadata[1733]: Jan 23 18:30:04.200 INFO Fetch successful
Jan 23 18:30:04.200589 coreos-metadata[1733]: Jan 23 18:30:04.200 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Jan 23 18:30:04.813912 coreos-metadata[1647]: Jan 23 18:30:04.813 INFO Fetch successful
Jan 23 18:30:04.813912 coreos-metadata[1647]: Jan 23 18:30:04.813 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Jan 23 18:30:06.791762 coreos-metadata[1733]: Jan 23 18:30:06.791 INFO Fetch successful
Jan 23 18:30:06.793982 unknown[1733]: wrote ssh authorized keys file for user: core
Jan 23 18:30:06.817770 update-ssh-keys[1850]: Updated "/home/core/.ssh/authorized_keys"
Jan 23 18:30:06.818582 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Jan 23 18:30:06.819984 systemd[1]: Finished sshkeys.service.
Jan 23 18:30:07.666301 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Jan 23 18:30:07.667862 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 18:30:07.790916 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 18:30:07.801920 (kubelet)[1861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 23 18:30:07.845117 kubelet[1861]: E0123 18:30:07.845055 1861 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 23 18:30:07.848259 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 18:30:07.848376 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 23 18:30:07.848905 systemd[1]: kubelet.service: Consumed 139ms CPU time, 109M memory peak.
Jan 23 18:30:07.918195 coreos-metadata[1647]: Jan 23 18:30:07.918 INFO Fetch successful
Jan 23 18:30:07.918195 coreos-metadata[1647]: Jan 23 18:30:07.918 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Jan 23 18:30:08.522745 coreos-metadata[1647]: Jan 23 18:30:08.522 INFO Fetch successful
Jan 23 18:30:08.522745 coreos-metadata[1647]: Jan 23 18:30:08.522 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Jan 23 18:30:10.184951 systemd[1]: Started sshd@3-10.0.9.101:22-68.220.241.50:53882.service - OpenSSH per-connection server daemon (68.220.241.50:53882).
Jan 23 18:30:10.186093 coreos-metadata[1647]: Jan 23 18:30:10.186 INFO Fetch successful
Jan 23 18:30:10.186285 coreos-metadata[1647]: Jan 23 18:30:10.186 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Jan 23 18:30:10.706716 sshd[1869]: Accepted publickey for core from 68.220.241.50 port 53882 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:30:10.707802 sshd-session[1869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:30:10.711792 systemd-logind[1659]: New session 5 of user core.
Jan 23 18:30:10.721938 systemd[1]: Started session-5.scope - Session 5 of User core.
Jan 23 18:30:10.766489 coreos-metadata[1647]: Jan 23 18:30:10.766 INFO Fetch successful
Jan 23 18:30:10.795315 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Jan 23 18:30:10.795715 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Jan 23 18:30:10.795913 systemd[1]: Reached target multi-user.target - Multi-User System.
Jan 23 18:30:10.796039 systemd[1]: Startup finished in 3.531s (kernel) + 15.762s (initrd) + 18.299s (userspace) = 37.593s.
Jan 23 18:30:10.996130 sshd[1873]: Connection closed by 68.220.241.50 port 53882
Jan 23 18:30:10.996755 sshd-session[1869]: pam_unix(sshd:session): session closed for user core
Jan 23 18:30:11.000240 systemd[1]: sshd@3-10.0.9.101:22-68.220.241.50:53882.service: Deactivated successfully.
Jan 23 18:30:11.001563 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 18:30:11.002104 systemd-logind[1659]: Session 5 logged out. Waiting for processes to exit.
Jan 23 18:30:11.003118 systemd-logind[1659]: Removed session 5.
Jan 23 18:30:11.112457 systemd[1]: Started sshd@4-10.0.9.101:22-68.220.241.50:34530.service - OpenSSH per-connection server daemon (68.220.241.50:34530).
Jan 23 18:30:11.632164 sshd[1884]: Accepted publickey for core from 68.220.241.50 port 34530 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:30:11.633334 sshd-session[1884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:30:11.641197 systemd-logind[1659]: New session 6 of user core.
Jan 23 18:30:11.651124 systemd[1]: Started session-6.scope - Session 6 of User core.
Jan 23 18:30:11.917648 sshd[1888]: Connection closed by 68.220.241.50 port 34530
Jan 23 18:30:11.918162 sshd-session[1884]: pam_unix(sshd:session): session closed for user core
Jan 23 18:30:11.921648 systemd[1]: sshd@4-10.0.9.101:22-68.220.241.50:34530.service: Deactivated successfully.
Jan 23 18:30:11.923023 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 18:30:11.924108 systemd-logind[1659]: Session 6 logged out. Waiting for processes to exit.
Jan 23 18:30:11.925010 systemd-logind[1659]: Removed session 6.
Jan 23 18:30:12.022231 systemd[1]: Started sshd@5-10.0.9.101:22-68.220.241.50:34540.service - OpenSSH per-connection server daemon (68.220.241.50:34540).
Jan 23 18:30:12.538397 sshd[1894]: Accepted publickey for core from 68.220.241.50 port 34540 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:30:12.539512 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:30:12.545526 systemd-logind[1659]: New session 7 of user core.
Jan 23 18:30:12.557810 systemd[1]: Started session-7.scope - Session 7 of User core.
Jan 23 18:30:12.827830 sshd[1898]: Connection closed by 68.220.241.50 port 34540
Jan 23 18:30:12.828354 sshd-session[1894]: pam_unix(sshd:session): session closed for user core
Jan 23 18:30:12.831554 systemd[1]: sshd@5-10.0.9.101:22-68.220.241.50:34540.service: Deactivated successfully.
Jan 23 18:30:12.832893 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 18:30:12.833565 systemd-logind[1659]: Session 7 logged out. Waiting for processes to exit.
Jan 23 18:30:12.835103 systemd-logind[1659]: Removed session 7.
Jan 23 18:30:12.944493 systemd[1]: Started sshd@6-10.0.9.101:22-68.220.241.50:34550.service - OpenSSH per-connection server daemon (68.220.241.50:34550).
Jan 23 18:30:13.463767 sshd[1904]: Accepted publickey for core from 68.220.241.50 port 34550 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:30:13.464876 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:30:13.468792 systemd-logind[1659]: New session 8 of user core.
Jan 23 18:30:13.478975 systemd[1]: Started session-8.scope - Session 8 of User core.
Jan 23 18:30:13.681082 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Jan 23 18:30:13.681365 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 23 18:30:13.695461 sudo[1909]: pam_unix(sudo:session): session closed for user root
Jan 23 18:30:13.792376 sshd[1908]: Connection closed by 68.220.241.50 port 34550
Jan 23 18:30:13.791711 sshd-session[1904]: pam_unix(sshd:session): session closed for user core
Jan 23 18:30:13.795118 systemd[1]: sshd@6-10.0.9.101:22-68.220.241.50:34550.service: Deactivated successfully.
Jan 23 18:30:13.796350 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 18:30:13.797443 systemd-logind[1659]: Session 8 logged out. Waiting for processes to exit.
Jan 23 18:30:13.798024 systemd-logind[1659]: Removed session 8.
Jan 23 18:30:13.907557 systemd[1]: Started sshd@7-10.0.9.101:22-68.220.241.50:34554.service - OpenSSH per-connection server daemon (68.220.241.50:34554).
Jan 23 18:30:14.419946 sshd[1916]: Accepted publickey for core from 68.220.241.50 port 34554 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:30:14.421071 sshd-session[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:14.424736 systemd-logind[1659]: New session 9 of user core. Jan 23 18:30:14.430760 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 23 18:30:14.616185 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 18:30:14.616420 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:14.620625 sudo[1922]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:14.625763 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 18:30:14.625985 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:14.632261 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 18:30:14.670000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 18:30:14.671674 kernel: kauditd_printk_skb: 188 callbacks suppressed Jan 23 18:30:14.671718 kernel: audit: type=1305 audit(1769193014.670:234): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 18:30:14.671889 augenrules[1946]: No rules Jan 23 18:30:14.670000 audit[1946]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdd8951f10 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:14.674153 systemd[1]: audit-rules.service: Deactivated successfully. 
Jan 23 18:30:14.674448 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 18:30:14.676528 kernel: audit: type=1300 audit(1769193014.670:234): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdd8951f10 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:14.670000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:30:14.677216 sudo[1921]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:14.672000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.680375 kernel: audit: type=1327 audit(1769193014.670:234): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:30:14.680406 kernel: audit: type=1130 audit(1769193014.672:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.680420 kernel: audit: type=1131 audit(1769193014.672:236): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.672000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:14.676000 audit[1921]: USER_END pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.684544 kernel: audit: type=1106 audit(1769193014.676:237): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.684573 kernel: audit: type=1104 audit(1769193014.676:238): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.676000 audit[1921]: CRED_DISP pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:14.772737 sshd[1920]: Connection closed by 68.220.241.50 port 34554 Jan 23 18:30:14.773168 sshd-session[1916]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:14.773000 audit[1916]: USER_END pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:14.779642 kernel: audit: type=1106 audit(1769193014.773:239): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:14.778916 systemd[1]: sshd@7-10.0.9.101:22-68.220.241.50:34554.service: Deactivated successfully. Jan 23 18:30:14.779767 systemd-logind[1659]: Session 9 logged out. Waiting for processes to exit. Jan 23 18:30:14.773000 audit[1916]: CRED_DISP pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:14.780825 systemd[1]: session-9.scope: Deactivated successfully. 
Jan 23 18:30:14.782619 kernel: audit: type=1104 audit(1769193014.773:240): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:14.778000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.9.101:22-68.220.241.50:34554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.783851 systemd-logind[1659]: Removed session 9. Jan 23 18:30:14.789772 kernel: audit: type=1131 audit(1769193014.778:241): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.9.101:22-68.220.241.50:34554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:14.881431 systemd[1]: Started sshd@8-10.0.9.101:22-68.220.241.50:34570.service - OpenSSH per-connection server daemon (68.220.241.50:34570). Jan 23 18:30:14.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.9.101:22-68.220.241.50:34570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:15.412000 audit[1955]: USER_ACCT pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:15.413067 sshd[1955]: Accepted publickey for core from 68.220.241.50 port 34570 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:30:15.413000 audit[1955]: CRED_ACQ pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:15.413000 audit[1955]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc26043cc0 a2=3 a3=0 items=0 ppid=1 pid=1955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:15.413000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:30:15.414642 sshd-session[1955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:15.418776 systemd-logind[1659]: New session 10 of user core. Jan 23 18:30:15.426060 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 23 18:30:15.428000 audit[1955]: USER_START pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:15.430000 audit[1959]: CRED_ACQ pid=1959 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:15.615000 audit[1960]: USER_ACCT pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:15.615000 audit[1960]: CRED_REFR pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:15.615000 audit[1960]: USER_START pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:15.615994 sudo[1960]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 18:30:15.616260 sudo[1960]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:16.030288 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 23 18:30:16.044886 (dockerd)[1978]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 18:30:16.342482 dockerd[1978]: time="2026-01-23T18:30:16.342114379Z" level=info msg="Starting up" Jan 23 18:30:16.344006 dockerd[1978]: time="2026-01-23T18:30:16.343983246Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 18:30:16.357807 dockerd[1978]: time="2026-01-23T18:30:16.357731433Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 18:30:16.372710 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3337550109-merged.mount: Deactivated successfully. Jan 23 18:30:16.391053 systemd[1]: var-lib-docker-metacopy\x2dcheck1411240556-merged.mount: Deactivated successfully. Jan 23 18:30:16.406248 dockerd[1978]: time="2026-01-23T18:30:16.406206064Z" level=info msg="Loading containers: start." 
Jan 23 18:30:16.417640 kernel: Initializing XFRM netlink socket Jan 23 18:30:16.471000 audit[2027]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.471000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffca9b57f00 a2=0 a3=0 items=0 ppid=1978 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.471000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 18:30:16.473000 audit[2029]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.473000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffeb73e28f0 a2=0 a3=0 items=0 ppid=1978 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.473000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 18:30:16.476000 audit[2031]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.476000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffaed8b890 a2=0 a3=0 items=0 ppid=1978 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.476000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 18:30:16.478000 audit[2033]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.478000 audit[2033]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd2bcc4a0 a2=0 a3=0 items=0 ppid=1978 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.478000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 18:30:16.480000 audit[2035]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.480000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd761982c0 a2=0 a3=0 items=0 ppid=1978 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.480000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 18:30:16.481000 audit[2037]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.481000 audit[2037]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd14d99ed0 a2=0 a3=0 items=0 ppid=1978 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.481000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:16.483000 audit[2039]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.483000 audit[2039]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff2cdf8b90 a2=0 a3=0 items=0 ppid=1978 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.483000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:16.485000 audit[2041]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.485000 audit[2041]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffe3fc0a7e0 a2=0 a3=0 items=0 ppid=1978 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.485000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 18:30:16.515000 audit[2044]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.515000 audit[2044]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffe145c5f60 a2=0 a3=0 items=0 ppid=1978 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.515000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 23 18:30:16.517000 audit[2046]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.517000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffcf041b540 a2=0 a3=0 items=0 ppid=1978 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.517000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 18:30:16.519000 audit[2048]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.519000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd3d13d720 a2=0 a3=0 items=0 ppid=1978 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.519000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 18:30:16.521000 audit[2050]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.521000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd5b04bb70 a2=0 a3=0 items=0 ppid=1978 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.521000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:16.523000 audit[2052]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.523000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffc2d0b6f30 a2=0 a3=0 items=0 ppid=1978 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.523000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 18:30:16.557000 audit[2082]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.557000 audit[2082]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7fffcbb92510 a2=0 a3=0 items=0 ppid=1978 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.557000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 18:30:16.559000 audit[2084]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.559000 audit[2084]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc2c6712f0 a2=0 a3=0 items=0 ppid=1978 pid=2084 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.559000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 18:30:16.561000 audit[2086]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.561000 audit[2086]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd8cb9e220 a2=0 a3=0 items=0 ppid=1978 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.561000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 18:30:16.563000 audit[2088]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.563000 audit[2088]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffffd8b94a0 a2=0 a3=0 items=0 ppid=1978 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.563000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 18:30:16.565000 audit[2090]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.565000 audit[2090]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff7345e9c0 a2=0 a3=0 items=0 ppid=1978 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.565000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 18:30:16.567000 audit[2092]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.567000 audit[2092]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc14125fa0 a2=0 a3=0 items=0 ppid=1978 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.567000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:16.569000 audit[2094]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.569000 audit[2094]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc9cfb9f50 a2=0 a3=0 items=0 ppid=1978 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.569000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:16.571000 audit[2096]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.571000 audit[2096]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffef3ddf760 a2=0 a3=0 items=0 ppid=1978 pid=2096 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.571000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 18:30:16.574000 audit[2098]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.574000 audit[2098]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffcc89e75a0 a2=0 a3=0 items=0 ppid=1978 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.574000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 23 18:30:16.576000 audit[2100]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.576000 audit[2100]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc416bcdd0 a2=0 a3=0 items=0 ppid=1978 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.576000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 18:30:16.578000 audit[2102]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2102 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 23 18:30:16.578000 audit[2102]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fffdf578e50 a2=0 a3=0 items=0 ppid=1978 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.578000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 18:30:16.580000 audit[2104]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.580000 audit[2104]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fff48ac2cc0 a2=0 a3=0 items=0 ppid=1978 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.580000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:16.582000 audit[2106]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.582000 audit[2106]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffd5d3563c0 a2=0 a3=0 items=0 ppid=1978 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.582000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 18:30:16.587000 audit[2111]: NETFILTER_CFG table=filter:28 family=2 entries=1 
op=nft_register_chain pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.587000 audit[2111]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc1381eb60 a2=0 a3=0 items=0 ppid=1978 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.587000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 18:30:16.589000 audit[2113]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.589000 audit[2113]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd78fd8b70 a2=0 a3=0 items=0 ppid=1978 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.589000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 18:30:16.591000 audit[2115]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.591000 audit[2115]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd122dd360 a2=0 a3=0 items=0 ppid=1978 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.591000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 18:30:16.593000 audit[2117]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain 
pid=2117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.593000 audit[2117]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffec72a5c10 a2=0 a3=0 items=0 ppid=1978 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.593000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 18:30:16.595000 audit[2119]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.595000 audit[2119]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffda2a51ee0 a2=0 a3=0 items=0 ppid=1978 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.595000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 18:30:16.597000 audit[2121]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:16.597000 audit[2121]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffdeff98af0 a2=0 a3=0 items=0 ppid=1978 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.597000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 18:30:16.620000 audit[2127]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2127 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.620000 audit[2127]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffd210f87e0 a2=0 a3=0 items=0 ppid=1978 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.620000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 23 18:30:16.622000 audit[2129]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.622000 audit[2129]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffc940ff840 a2=0 a3=0 items=0 ppid=1978 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.622000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 23 18:30:16.631000 audit[2137]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.631000 audit[2137]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7fff3ad3b510 a2=0 a3=0 items=0 ppid=1978 pid=2137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.631000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 23 18:30:16.640000 audit[2143]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.640000 audit[2143]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffcb94173e0 a2=0 a3=0 items=0 ppid=1978 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.640000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 23 18:30:16.642000 audit[2145]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.642000 audit[2145]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffef18f42e0 a2=0 a3=0 items=0 ppid=1978 pid=2145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.642000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 23 18:30:16.644000 audit[2147]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.644000 audit[2147]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff2e4b6090 a2=0 a3=0 items=0 ppid=1978 pid=2147 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.644000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 23 18:30:16.646000 audit[2149]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.646000 audit[2149]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd64f58b10 a2=0 a3=0 items=0 ppid=1978 pid=2149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.646000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:16.648000 audit[2151]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:16.648000 audit[2151]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd7a335f20 a2=0 a3=0 items=0 ppid=1978 pid=2151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:16.648000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 23 18:30:16.649964 systemd-networkd[1583]: docker0: Link UP Jan 23 18:30:16.655048 dockerd[1978]: time="2026-01-23T18:30:16.655005475Z" 
level=info msg="Loading containers: done." Jan 23 18:30:16.678274 dockerd[1978]: time="2026-01-23T18:30:16.678213216Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 18:30:16.678473 dockerd[1978]: time="2026-01-23T18:30:16.678321357Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 18:30:16.678473 dockerd[1978]: time="2026-01-23T18:30:16.678414471Z" level=info msg="Initializing buildkit" Jan 23 18:30:16.700324 dockerd[1978]: time="2026-01-23T18:30:16.700263379Z" level=info msg="Completed buildkit initialization" Jan 23 18:30:16.707624 dockerd[1978]: time="2026-01-23T18:30:16.707574122Z" level=info msg="Daemon has completed initialization" Jan 23 18:30:16.707841 dockerd[1978]: time="2026-01-23T18:30:16.707759270Z" level=info msg="API listen on /run/docker.sock" Jan 23 18:30:16.708052 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 23 18:30:16.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:17.880268 containerd[1690]: time="2026-01-23T18:30:17.880220711Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 23 18:30:17.916157 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 18:30:17.919778 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:18.043224 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:18.042000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 23 18:30:18.053963 (kubelet)[2197]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 18:30:18.102777 kubelet[2197]: E0123 18:30:18.102743 2197 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 18:30:18.105794 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 18:30:18.105916 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 18:30:18.105000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:30:18.106227 systemd[1]: kubelet.service: Consumed 134ms CPU time, 111M memory peak. Jan 23 18:30:18.644027 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1784030917.mount: Deactivated successfully. Jan 23 18:30:19.101767 chronyd[1645]: Selected source PHC0 Jan 23 18:30:19.101795 chronyd[1645]: System clock wrong by 1.687656 seconds Jan 23 18:30:20.789885 systemd-resolved[1358]: Clock change detected. Flushing caches. 
Jan 23 18:30:20.790226 chronyd[1645]: System clock was stepped by 1.687656 seconds Jan 23 18:30:21.126281 containerd[1690]: time="2026-01-23T18:30:21.125447294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:21.127640 containerd[1690]: time="2026-01-23T18:30:21.127616511Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 23 18:30:21.128430 containerd[1690]: time="2026-01-23T18:30:21.128411999Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:21.131138 containerd[1690]: time="2026-01-23T18:30:21.131116822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:21.132540 containerd[1690]: time="2026-01-23T18:30:21.132518417Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.564603356s" Jan 23 18:30:21.132618 containerd[1690]: time="2026-01-23T18:30:21.132608063Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 23 18:30:21.133321 containerd[1690]: time="2026-01-23T18:30:21.133306473Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 23 18:30:22.561099 containerd[1690]: time="2026-01-23T18:30:22.560373197Z" level=info 
msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:22.561553 containerd[1690]: time="2026-01-23T18:30:22.561533186Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=0" Jan 23 18:30:22.562207 containerd[1690]: time="2026-01-23T18:30:22.562191797Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:22.564227 containerd[1690]: time="2026-01-23T18:30:22.564203883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:22.565209 containerd[1690]: time="2026-01-23T18:30:22.565189093Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.431754111s" Jan 23 18:30:22.565475 containerd[1690]: time="2026-01-23T18:30:22.565462824Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 23 18:30:22.566108 containerd[1690]: time="2026-01-23T18:30:22.566094194Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 23 18:30:23.572516 containerd[1690]: time="2026-01-23T18:30:23.572443398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 
23 18:30:23.574951 containerd[1690]: time="2026-01-23T18:30:23.574923941Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=0" Jan 23 18:30:23.575791 containerd[1690]: time="2026-01-23T18:30:23.575767170Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:23.578675 containerd[1690]: time="2026-01-23T18:30:23.578651967Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:23.579859 containerd[1690]: time="2026-01-23T18:30:23.579836247Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.01364241s" Jan 23 18:30:23.579892 containerd[1690]: time="2026-01-23T18:30:23.579861215Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 23 18:30:23.580370 containerd[1690]: time="2026-01-23T18:30:23.580344042Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 23 18:30:24.512361 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4147349357.mount: Deactivated successfully. 
Jan 23 18:30:24.897923 containerd[1690]: time="2026-01-23T18:30:24.897863068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:24.899242 containerd[1690]: time="2026-01-23T18:30:24.899213927Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=0" Jan 23 18:30:24.900078 containerd[1690]: time="2026-01-23T18:30:24.900037659Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:24.902665 containerd[1690]: time="2026-01-23T18:30:24.901971732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:24.902665 containerd[1690]: time="2026-01-23T18:30:24.902381257Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.322014095s" Jan 23 18:30:24.902665 containerd[1690]: time="2026-01-23T18:30:24.902404344Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 23 18:30:24.902797 containerd[1690]: time="2026-01-23T18:30:24.902772136Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 23 18:30:25.443050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4031689282.mount: Deactivated successfully. 
Jan 23 18:30:26.019775 containerd[1690]: time="2026-01-23T18:30:26.019481032Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:26.020991 containerd[1690]: time="2026-01-23T18:30:26.020968250Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=0" Jan 23 18:30:26.021833 containerd[1690]: time="2026-01-23T18:30:26.021802024Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:26.025280 containerd[1690]: time="2026-01-23T18:30:26.024723486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:26.025397 containerd[1690]: time="2026-01-23T18:30:26.025334849Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.122534725s" Jan 23 18:30:26.025433 containerd[1690]: time="2026-01-23T18:30:26.025406122Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 23 18:30:26.025941 containerd[1690]: time="2026-01-23T18:30:26.025925466Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 23 18:30:26.470792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2637734604.mount: Deactivated successfully. 
Jan 23 18:30:26.478077 containerd[1690]: time="2026-01-23T18:30:26.477512613Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:26.479697 containerd[1690]: time="2026-01-23T18:30:26.479677528Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 18:30:26.480492 containerd[1690]: time="2026-01-23T18:30:26.480474429Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:26.483032 containerd[1690]: time="2026-01-23T18:30:26.483012505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:26.483562 containerd[1690]: time="2026-01-23T18:30:26.483395897Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 457.44795ms" Jan 23 18:30:26.483634 containerd[1690]: time="2026-01-23T18:30:26.483624648Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 23 18:30:26.484269 containerd[1690]: time="2026-01-23T18:30:26.484238332Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 23 18:30:27.094075 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2478809554.mount: Deactivated 
successfully. Jan 23 18:30:28.504432 containerd[1690]: time="2026-01-23T18:30:28.504384513Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:28.506101 containerd[1690]: time="2026-01-23T18:30:28.506077614Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 23 18:30:28.507074 containerd[1690]: time="2026-01-23T18:30:28.507052464Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:28.510124 containerd[1690]: time="2026-01-23T18:30:28.510004748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:28.510606 containerd[1690]: time="2026-01-23T18:30:28.510478126Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.02613313s" Jan 23 18:30:28.510606 containerd[1690]: time="2026-01-23T18:30:28.510510529Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 23 18:30:29.853764 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 23 18:30:29.857559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 18:30:29.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.993493 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:29.994781 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 23 18:30:29.994831 kernel: audit: type=1130 audit(1769193029.992:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.001537 (kubelet)[2415]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 18:30:30.042048 kubelet[2415]: E0123 18:30:30.042015 2415 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 18:30:30.049357 kernel: audit: type=1131 audit(1769193030.044:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:30:30.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:30:30.045132 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 18:30:30.045249 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 18:30:30.045612 systemd[1]: kubelet.service: Consumed 134ms CPU time, 110.2M memory peak. 
Jan 23 18:30:31.347738 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:31.348439 systemd[1]: kubelet.service: Consumed 134ms CPU time, 110.2M memory peak. Jan 23 18:30:31.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.353287 kernel: audit: type=1130 audit(1769193031.347:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.353333 kernel: audit: type=1131 audit(1769193031.347:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.352740 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:31.382825 systemd[1]: Reload requested from client PID 2430 ('systemctl') (unit session-10.scope)... Jan 23 18:30:31.382837 systemd[1]: Reloading... Jan 23 18:30:31.480277 zram_generator::config[2473]: No configuration found. Jan 23 18:30:31.662303 systemd[1]: Reloading finished in 279 ms. 
Jan 23 18:30:31.698820 kernel: audit: type=1334 audit(1769193031.693:298): prog-id=63 op=LOAD Jan 23 18:30:31.698906 kernel: audit: type=1334 audit(1769193031.693:299): prog-id=64 op=LOAD Jan 23 18:30:31.693000 audit: BPF prog-id=63 op=LOAD Jan 23 18:30:31.693000 audit: BPF prog-id=64 op=LOAD Jan 23 18:30:31.694000 audit: BPF prog-id=43 op=UNLOAD Jan 23 18:30:31.701767 kernel: audit: type=1334 audit(1769193031.694:300): prog-id=43 op=UNLOAD Jan 23 18:30:31.701808 kernel: audit: type=1334 audit(1769193031.694:301): prog-id=44 op=UNLOAD Jan 23 18:30:31.694000 audit: BPF prog-id=44 op=UNLOAD Jan 23 18:30:31.702846 kernel: audit: type=1334 audit(1769193031.695:302): prog-id=65 op=LOAD Jan 23 18:30:31.695000 audit: BPF prog-id=65 op=LOAD Jan 23 18:30:31.703921 kernel: audit: type=1334 audit(1769193031.697:303): prog-id=60 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=60 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=66 op=LOAD Jan 23 18:30:31.697000 audit: BPF prog-id=67 op=LOAD Jan 23 18:30:31.697000 audit: BPF prog-id=61 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=62 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=68 op=LOAD Jan 23 18:30:31.697000 audit: BPF prog-id=52 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=69 op=LOAD Jan 23 18:30:31.697000 audit: BPF prog-id=70 op=LOAD Jan 23 18:30:31.697000 audit: BPF prog-id=53 op=UNLOAD Jan 23 18:30:31.697000 audit: BPF prog-id=54 op=UNLOAD Jan 23 18:30:31.698000 audit: BPF prog-id=71 op=LOAD Jan 23 18:30:31.698000 audit: BPF prog-id=55 op=UNLOAD Jan 23 18:30:31.698000 audit: BPF prog-id=72 op=LOAD Jan 23 18:30:31.698000 audit: BPF prog-id=73 op=LOAD Jan 23 18:30:31.698000 audit: BPF prog-id=56 op=UNLOAD Jan 23 18:30:31.698000 audit: BPF prog-id=57 op=UNLOAD Jan 23 18:30:31.698000 audit: BPF prog-id=74 op=LOAD Jan 23 18:30:31.698000 audit: BPF prog-id=58 op=UNLOAD Jan 23 18:30:31.700000 audit: BPF prog-id=75 op=LOAD Jan 23 18:30:31.700000 audit: BPF prog-id=48 op=UNLOAD Jan 23 18:30:31.701000 
audit: BPF prog-id=76 op=LOAD Jan 23 18:30:31.701000 audit: BPF prog-id=45 op=UNLOAD Jan 23 18:30:31.701000 audit: BPF prog-id=77 op=LOAD Jan 23 18:30:31.702000 audit: BPF prog-id=78 op=LOAD Jan 23 18:30:31.702000 audit: BPF prog-id=46 op=UNLOAD Jan 23 18:30:31.702000 audit: BPF prog-id=47 op=UNLOAD Jan 23 18:30:31.702000 audit: BPF prog-id=79 op=LOAD Jan 23 18:30:31.702000 audit: BPF prog-id=49 op=UNLOAD Jan 23 18:30:31.702000 audit: BPF prog-id=80 op=LOAD Jan 23 18:30:31.702000 audit: BPF prog-id=81 op=LOAD Jan 23 18:30:31.702000 audit: BPF prog-id=50 op=UNLOAD Jan 23 18:30:31.702000 audit: BPF prog-id=51 op=UNLOAD Jan 23 18:30:31.704000 audit: BPF prog-id=82 op=LOAD Jan 23 18:30:31.704000 audit: BPF prog-id=59 op=UNLOAD Jan 23 18:30:31.724618 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 23 18:30:31.724682 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 23 18:30:31.724940 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:31.724987 systemd[1]: kubelet.service: Consumed 85ms CPU time, 98.7M memory peak. Jan 23 18:30:31.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:30:31.726204 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:31.848098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:31.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:31.853612 (kubelet)[2529]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 18:30:31.891396 kubelet[2529]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:30:31.891745 kubelet[2529]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 18:30:31.891745 kubelet[2529]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:30:31.891885 kubelet[2529]: I0123 18:30:31.891852 2529 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 18:30:32.550139 kubelet[2529]: I0123 18:30:32.550113 2529 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 23 18:30:32.550273 kubelet[2529]: I0123 18:30:32.550250 2529 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 18:30:32.550539 kubelet[2529]: I0123 18:30:32.550530 2529 server.go:954] "Client rotation is on, will bootstrap in background" Jan 23 18:30:32.584413 kubelet[2529]: E0123 18:30:32.584379 2529 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.9.101:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.9.101:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:30:32.585115 kubelet[2529]: I0123 
18:30:32.585095 2529 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 18:30:32.593935 kubelet[2529]: I0123 18:30:32.593917 2529 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 18:30:32.598059 kubelet[2529]: I0123 18:30:32.597815 2529 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 23 18:30:32.598841 kubelet[2529]: I0123 18:30:32.598807 2529 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 18:30:32.599062 kubelet[2529]: I0123 18:30:32.598915 2529 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-1-0-2-32611d5cc2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 18:30:32.599181 kubelet[2529]: I0123 18:30:32.599174 2529 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 18:30:32.599218 kubelet[2529]: I0123 18:30:32.599214 2529 container_manager_linux.go:304] "Creating device plugin manager" Jan 23 18:30:32.599362 kubelet[2529]: I0123 18:30:32.599355 2529 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:30:32.605748 kubelet[2529]: I0123 18:30:32.605732 2529 kubelet.go:446] "Attempting to sync node with API server" Jan 23 18:30:32.606034 kubelet[2529]: I0123 18:30:32.605821 2529 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 18:30:32.606034 kubelet[2529]: I0123 18:30:32.605845 2529 kubelet.go:352] "Adding apiserver pod source" Jan 23 18:30:32.606034 kubelet[2529]: I0123 18:30:32.605854 2529 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 18:30:32.618366 kubelet[2529]: W0123 18:30:32.618326 2529 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.9.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-2-32611d5cc2&limit=500&resourceVersion=0": dial tcp 10.0.9.101:6443: connect: connection refused Jan 23 18:30:32.618496 kubelet[2529]: E0123 18:30:32.618483 2529 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.9.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-2-32611d5cc2&limit=500&resourceVersion=0\": dial tcp 10.0.9.101:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:30:32.618619 kubelet[2529]: W0123 
18:30:32.618599 2529 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.9.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.9.101:6443: connect: connection refused Jan 23 18:30:32.618666 kubelet[2529]: E0123 18:30:32.618658 2529 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.9.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.9.101:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:30:32.619610 kubelet[2529]: I0123 18:30:32.618787 2529 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 18:30:32.619610 kubelet[2529]: I0123 18:30:32.619253 2529 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 23 18:30:32.619610 kubelet[2529]: W0123 18:30:32.619319 2529 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jan 23 18:30:32.624369 kubelet[2529]: I0123 18:30:32.624358 2529 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 18:30:32.624451 kubelet[2529]: I0123 18:30:32.624445 2529 server.go:1287] "Started kubelet" Jan 23 18:30:32.625547 kubelet[2529]: I0123 18:30:32.625536 2529 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 18:30:32.627000 audit[2542]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2542 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.627000 audit[2542]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc625d8890 a2=0 a3=0 items=0 ppid=2529 pid=2542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.627000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 18:30:32.629000 audit[2543]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2543 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.629000 audit[2543]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff978a4710 a2=0 a3=0 items=0 ppid=2529 pid=2543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.629000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 18:30:32.631126 kubelet[2529]: E0123 18:30:32.629807 2529 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.9.101:6443/api/v1/namespaces/default/events\": dial tcp 10.0.9.101:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-1-0-2-32611d5cc2.188d6fb088acbed8 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-1-0-2-32611d5cc2,UID:ci-4547-1-0-2-32611d5cc2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-1-0-2-32611d5cc2,},FirstTimestamp:2026-01-23 18:30:32.624422616 +0000 UTC m=+0.767915228,LastTimestamp:2026-01-23 18:30:32.624422616 +0000 UTC m=+0.767915228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-1-0-2-32611d5cc2,}" Jan 23 18:30:32.632027 kubelet[2529]: I0123 18:30:32.631995 2529 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 18:30:32.634452 kubelet[2529]: I0123 18:30:32.633266 2529 server.go:479] "Adding debug handlers to kubelet server" Jan 23 18:30:32.634452 kubelet[2529]: I0123 18:30:32.633480 2529 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 18:30:32.634452 kubelet[2529]: I0123 18:30:32.633795 2529 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 18:30:32.634452 kubelet[2529]: I0123 18:30:32.634022 2529 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 18:30:32.634452 kubelet[2529]: I0123 18:30:32.634027 2529 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 18:30:32.634452 kubelet[2529]: E0123 18:30:32.634191 2529 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" Jan 23 18:30:32.634000 audit[2545]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2545 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.634000 audit[2545]: SYSCALL arch=c000003e syscall=46 
success=yes exit=340 a0=3 a1=7ffce47a6d90 a2=0 a3=0 items=0 ppid=2529 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.634000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:30:32.637095 kubelet[2529]: E0123 18:30:32.637065 2529 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-2-32611d5cc2?timeout=10s\": dial tcp 10.0.9.101:6443: connect: connection refused" interval="200ms" Jan 23 18:30:32.637393 kubelet[2529]: I0123 18:30:32.637384 2529 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 23 18:30:32.637465 kubelet[2529]: I0123 18:30:32.637460 2529 reconciler.go:26] "Reconciler: start to sync state" Jan 23 18:30:32.638095 kubelet[2529]: E0123 18:30:32.638081 2529 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 18:30:32.638541 kubelet[2529]: W0123 18:30:32.638513 2529 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.9.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.9.101:6443: connect: connection refused Jan 23 18:30:32.638586 kubelet[2529]: E0123 18:30:32.638550 2529 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.9.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.9.101:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:30:32.638700 kubelet[2529]: I0123 18:30:32.638689 2529 factory.go:221] Registration of the containerd container factory successfully Jan 23 18:30:32.638700 kubelet[2529]: I0123 18:30:32.638700 2529 factory.go:221] Registration of the systemd container factory successfully Jan 23 18:30:32.638764 kubelet[2529]: I0123 18:30:32.638753 2529 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 18:30:32.637000 audit[2547]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2547 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.637000 audit[2547]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd79932730 a2=0 a3=0 items=0 ppid=2529 pid=2547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.637000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 
23 18:30:32.644000 audit[2550]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2550 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.644000 audit[2550]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffd21775190 a2=0 a3=0 items=0 ppid=2529 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.644000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 23 18:30:32.646231 kubelet[2529]: I0123 18:30:32.646205 2529 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 23 18:30:32.645000 audit[2551]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2551 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:32.645000 audit[2551]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc96ba2a50 a2=0 a3=0 items=0 ppid=2529 pid=2551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.645000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 18:30:32.647370 kubelet[2529]: I0123 18:30:32.647357 2529 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 23 18:30:32.647422 kubelet[2529]: I0123 18:30:32.647418 2529 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 23 18:30:32.647470 kubelet[2529]: I0123 18:30:32.647465 2529 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 23 18:30:32.647501 kubelet[2529]: I0123 18:30:32.647498 2529 kubelet.go:2382] "Starting kubelet main sync loop" Jan 23 18:30:32.647571 kubelet[2529]: E0123 18:30:32.647560 2529 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 18:30:32.647000 audit[2553]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.647000 audit[2553]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe18a1b2e0 a2=0 a3=0 items=0 ppid=2529 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.647000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 18:30:32.648000 audit[2554]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2554 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.648000 audit[2554]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1bd955e0 a2=0 a3=0 items=0 ppid=2529 pid=2554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.648000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 18:30:32.649000 audit[2555]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:32.649000 audit[2555]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc9e6a3a00 a2=0 a3=0 items=0 ppid=2529 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.649000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 18:30:32.649000 audit[2556]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2556 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:32.649000 audit[2556]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd74f1a880 a2=0 a3=0 items=0 ppid=2529 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.649000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 18:30:32.650000 audit[2557]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:32.650000 audit[2557]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde6a83ad0 a2=0 a3=0 items=0 ppid=2529 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.650000 
audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 18:30:32.651000 audit[2558]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:32.651000 audit[2558]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffffea8b7f0 a2=0 a3=0 items=0 ppid=2529 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:32.651000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 18:30:32.653991 kubelet[2529]: W0123 18:30:32.653950 2529 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.9.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.9.101:6443: connect: connection refused Jan 23 18:30:32.654037 kubelet[2529]: E0123 18:30:32.653995 2529 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.9.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.9.101:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:30:32.665987 kubelet[2529]: I0123 18:30:32.665959 2529 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 18:30:32.666274 kubelet[2529]: I0123 18:30:32.666079 2529 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 18:30:32.666274 kubelet[2529]: I0123 18:30:32.666093 2529 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:30:32.668184 kubelet[2529]: I0123 18:30:32.668029 2529 policy_none.go:49] "None policy: Start" Jan 23 
18:30:32.668184 kubelet[2529]: I0123 18:30:32.668044 2529 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 18:30:32.668184 kubelet[2529]: I0123 18:30:32.668054 2529 state_mem.go:35] "Initializing new in-memory state store" Jan 23 18:30:32.673391 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 23 18:30:32.687664 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 23 18:30:32.690193 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 23 18:30:32.698280 kubelet[2529]: I0123 18:30:32.697999 2529 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 23 18:30:32.698280 kubelet[2529]: I0123 18:30:32.698151 2529 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 18:30:32.698280 kubelet[2529]: I0123 18:30:32.698160 2529 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 18:30:32.698692 kubelet[2529]: I0123 18:30:32.698629 2529 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 18:30:32.699852 kubelet[2529]: E0123 18:30:32.699825 2529 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 18:30:32.699912 kubelet[2529]: E0123 18:30:32.699902 2529 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-1-0-2-32611d5cc2\" not found" Jan 23 18:30:32.757845 systemd[1]: Created slice kubepods-burstable-pod52b3e3fd5c6d5319ece2e4ca03b9ebb0.slice - libcontainer container kubepods-burstable-pod52b3e3fd5c6d5319ece2e4ca03b9ebb0.slice. 
Jan 23 18:30:32.765151 kubelet[2529]: E0123 18:30:32.764971 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.767785 systemd[1]: Created slice kubepods-burstable-pod3779b9220c588cc794d8ce6b970927e8.slice - libcontainer container kubepods-burstable-pod3779b9220c588cc794d8ce6b970927e8.slice. Jan 23 18:30:32.770373 kubelet[2529]: E0123 18:30:32.770350 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.772804 systemd[1]: Created slice kubepods-burstable-pod9eef90d1ea79562f40cf400747a3d4c7.slice - libcontainer container kubepods-burstable-pod9eef90d1ea79562f40cf400747a3d4c7.slice. Jan 23 18:30:32.775171 kubelet[2529]: E0123 18:30:32.775001 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.799876 kubelet[2529]: I0123 18:30:32.799849 2529 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.801415 kubelet[2529]: E0123 18:30:32.800336 2529 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.101:6443/api/v1/nodes\": dial tcp 10.0.9.101:6443: connect: connection refused" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838076 kubelet[2529]: I0123 18:30:32.837867 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9eef90d1ea79562f40cf400747a3d4c7-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-2-32611d5cc2\" (UID: \"9eef90d1ea79562f40cf400747a3d4c7\") " pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838076 
kubelet[2529]: I0123 18:30:32.837902 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-k8s-certs\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838076 kubelet[2529]: I0123 18:30:32.837923 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838076 kubelet[2529]: I0123 18:30:32.837941 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-ca-certs\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838076 kubelet[2529]: I0123 18:30:32.837958 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838369 kubelet[2529]: I0123 18:30:32.837971 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-kubeconfig\") pod 
\"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838369 kubelet[2529]: I0123 18:30:32.837983 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838369 kubelet[2529]: I0123 18:30:32.838001 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838369 kubelet[2529]: I0123 18:30:32.838018 2529 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-k8s-certs\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:32.838369 kubelet[2529]: E0123 18:30:32.838017 2529 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-2-32611d5cc2?timeout=10s\": dial tcp 10.0.9.101:6443: connect: connection refused" interval="400ms" Jan 23 18:30:33.003403 kubelet[2529]: I0123 18:30:33.003352 2529 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.003827 kubelet[2529]: E0123 18:30:33.003800 2529 
kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.101:6443/api/v1/nodes\": dial tcp 10.0.9.101:6443: connect: connection refused" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.066405 containerd[1690]: time="2026-01-23T18:30:33.066249225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-2-32611d5cc2,Uid:52b3e3fd5c6d5319ece2e4ca03b9ebb0,Namespace:kube-system,Attempt:0,}" Jan 23 18:30:33.072233 containerd[1690]: time="2026-01-23T18:30:33.071712165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-2-32611d5cc2,Uid:3779b9220c588cc794d8ce6b970927e8,Namespace:kube-system,Attempt:0,}" Jan 23 18:30:33.075892 containerd[1690]: time="2026-01-23T18:30:33.075868131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-2-32611d5cc2,Uid:9eef90d1ea79562f40cf400747a3d4c7,Namespace:kube-system,Attempt:0,}" Jan 23 18:30:33.104653 containerd[1690]: time="2026-01-23T18:30:33.103600608Z" level=info msg="connecting to shim 1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0" address="unix:///run/containerd/s/44a438c30d5ae7093a3463589f234cc0dbf4d41ac9700b2514fbf87897239507" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:30:33.114501 containerd[1690]: time="2026-01-23T18:30:33.114456142Z" level=info msg="connecting to shim 14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143" address="unix:///run/containerd/s/6b12dc3141138823df98416ef9a8ddbad15a648507919503ca0bfa6903a13124" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:30:33.119942 containerd[1690]: time="2026-01-23T18:30:33.119912501Z" level=info msg="connecting to shim d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f" address="unix:///run/containerd/s/670e9b1f02ae640eb538a016c40a3b185f4d90182f3951b43f5d3c7524940110" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:30:33.140452 systemd[1]: Started 
cri-containerd-1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0.scope - libcontainer container 1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0. Jan 23 18:30:33.155522 systemd[1]: Started cri-containerd-d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f.scope - libcontainer container d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f. Jan 23 18:30:33.157000 audit: BPF prog-id=83 op=LOAD Jan 23 18:30:33.160806 systemd[1]: Started cri-containerd-14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143.scope - libcontainer container 14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143. Jan 23 18:30:33.160000 audit: BPF prog-id=84 op=LOAD Jan 23 18:30:33.160000 audit[2598]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.160000 audit: BPF prog-id=84 op=UNLOAD Jan 23 18:30:33.160000 audit[2598]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.160000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.160000 audit: BPF prog-id=85 op=LOAD Jan 23 18:30:33.160000 audit[2598]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.160000 audit: BPF prog-id=86 op=LOAD Jan 23 18:30:33.160000 audit[2598]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.160000 audit: BPF prog-id=86 op=UNLOAD Jan 23 18:30:33.160000 audit[2598]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 23 18:30:33.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.161000 audit: BPF prog-id=85 op=UNLOAD Jan 23 18:30:33.161000 audit[2598]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.161000 audit: BPF prog-id=87 op=LOAD Jan 23 18:30:33.161000 audit[2598]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=2570 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.161000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163656130646230666232323930343566303435346463386163623737 Jan 23 18:30:33.170000 audit: BPF prog-id=88 op=LOAD Jan 23 18:30:33.170000 audit: BPF prog-id=89 op=LOAD Jan 23 18:30:33.170000 audit[2638]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000228238 a2=98 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.170000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.170000 audit: BPF prog-id=89 op=UNLOAD Jan 23 18:30:33.170000 audit[2638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.170000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.171000 audit: BPF prog-id=90 op=LOAD Jan 23 18:30:33.171000 audit[2638]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000228488 a2=98 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.171000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.171000 audit: BPF prog-id=91 op=LOAD Jan 23 18:30:33.171000 audit[2638]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000228218 a2=98 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.171000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.171000 audit: BPF prog-id=91 op=UNLOAD Jan 23 18:30:33.171000 audit[2638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.171000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.171000 audit: BPF prog-id=90 op=UNLOAD Jan 23 18:30:33.171000 audit[2638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.171000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.171000 audit: BPF prog-id=92 op=LOAD Jan 23 18:30:33.171000 audit[2638]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0002286e8 a2=98 a3=0 items=0 ppid=2596 pid=2638 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.171000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134666564646366613765363839613761313333363166653862653735 Jan 23 18:30:33.181000 audit: BPF prog-id=93 op=LOAD Jan 23 18:30:33.182000 audit: BPF prog-id=94 op=LOAD Jan 23 18:30:33.182000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.182000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.182000 audit: BPF prog-id=94 op=UNLOAD Jan 23 18:30:33.182000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.182000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.182000 audit: BPF prog-id=95 op=LOAD Jan 23 18:30:33.182000 audit[2632]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.182000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.184000 audit: BPF prog-id=96 op=LOAD Jan 23 18:30:33.184000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.184000 audit: BPF prog-id=96 op=UNLOAD Jan 23 18:30:33.184000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.184000 audit: BPF prog-id=95 op=UNLOAD 
Jan 23 18:30:33.184000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.184000 audit: BPF prog-id=97 op=LOAD Jan 23 18:30:33.184000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2600 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433393064336461663739616534633432376234346636653332326532 Jan 23 18:30:33.213550 containerd[1690]: time="2026-01-23T18:30:33.213497718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-2-32611d5cc2,Uid:9eef90d1ea79562f40cf400747a3d4c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143\"" Jan 23 18:30:33.214156 containerd[1690]: time="2026-01-23T18:30:33.214133701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-2-32611d5cc2,Uid:52b3e3fd5c6d5319ece2e4ca03b9ebb0,Namespace:kube-system,Attempt:0,} returns sandbox id \"1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0\"" Jan 23 18:30:33.217642 
containerd[1690]: time="2026-01-23T18:30:33.217569850Z" level=info msg="CreateContainer within sandbox \"14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 23 18:30:33.217970 containerd[1690]: time="2026-01-23T18:30:33.217954110Z" level=info msg="CreateContainer within sandbox \"1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 23 18:30:33.233689 containerd[1690]: time="2026-01-23T18:30:33.233654663Z" level=info msg="Container dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:30:33.234273 containerd[1690]: time="2026-01-23T18:30:33.234180336Z" level=info msg="Container 007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:30:33.239741 containerd[1690]: time="2026-01-23T18:30:33.239610456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-2-32611d5cc2,Uid:3779b9220c588cc794d8ce6b970927e8,Namespace:kube-system,Attempt:0,} returns sandbox id \"d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f\"" Jan 23 18:30:33.239808 kubelet[2529]: E0123 18:30:33.239675 2529 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-2-32611d5cc2?timeout=10s\": dial tcp 10.0.9.101:6443: connect: connection refused" interval="800ms" Jan 23 18:30:33.240247 containerd[1690]: time="2026-01-23T18:30:33.240218692Z" level=info msg="CreateContainer within sandbox \"14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45\"" Jan 23 18:30:33.240757 containerd[1690]: 
time="2026-01-23T18:30:33.240743458Z" level=info msg="StartContainer for \"dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45\"" Jan 23 18:30:33.242495 containerd[1690]: time="2026-01-23T18:30:33.242441481Z" level=info msg="connecting to shim dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45" address="unix:///run/containerd/s/6b12dc3141138823df98416ef9a8ddbad15a648507919503ca0bfa6903a13124" protocol=ttrpc version=3 Jan 23 18:30:33.243188 containerd[1690]: time="2026-01-23T18:30:33.242888530Z" level=info msg="CreateContainer within sandbox \"d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 23 18:30:33.243923 containerd[1690]: time="2026-01-23T18:30:33.243907285Z" level=info msg="CreateContainer within sandbox \"1cea0db0fb229045f0454dc8acb778a90b612971fe6baa20549b74c1177f5ef0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01\"" Jan 23 18:30:33.245271 containerd[1690]: time="2026-01-23T18:30:33.245246311Z" level=info msg="StartContainer for \"007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01\"" Jan 23 18:30:33.246366 containerd[1690]: time="2026-01-23T18:30:33.246339238Z" level=info msg="connecting to shim 007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01" address="unix:///run/containerd/s/44a438c30d5ae7093a3463589f234cc0dbf4d41ac9700b2514fbf87897239507" protocol=ttrpc version=3 Jan 23 18:30:33.249917 containerd[1690]: time="2026-01-23T18:30:33.249900684Z" level=info msg="Container 24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:30:33.259917 containerd[1690]: time="2026-01-23T18:30:33.259891346Z" level=info msg="CreateContainer within sandbox \"d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440\"" Jan 23 18:30:33.260342 containerd[1690]: time="2026-01-23T18:30:33.260329630Z" level=info msg="StartContainer for \"24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440\"" Jan 23 18:30:33.261310 containerd[1690]: time="2026-01-23T18:30:33.261220942Z" level=info msg="connecting to shim 24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440" address="unix:///run/containerd/s/670e9b1f02ae640eb538a016c40a3b185f4d90182f3951b43f5d3c7524940110" protocol=ttrpc version=3 Jan 23 18:30:33.262548 systemd[1]: Started cri-containerd-dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45.scope - libcontainer container dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45. Jan 23 18:30:33.271443 systemd[1]: Started cri-containerd-007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01.scope - libcontainer container 007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01. 
Jan 23 18:30:33.278000 audit: BPF prog-id=98 op=LOAD Jan 23 18:30:33.279000 audit: BPF prog-id=99 op=LOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=99 op=UNLOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=100 op=LOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=101 op=LOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=101 op=UNLOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=100 op=UNLOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.279000 audit: BPF prog-id=102 op=LOAD Jan 23 18:30:33.279000 audit[2699]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2596 pid=2699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464383764333330633638653861336363326263343964393366383964 Jan 23 18:30:33.287669 systemd[1]: Started cri-containerd-24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440.scope - libcontainer container 24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440. 
Jan 23 18:30:33.291000 audit: BPF prog-id=103 op=LOAD Jan 23 18:30:33.292000 audit: BPF prog-id=104 op=LOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=104 op=UNLOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=105 op=LOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=106 op=LOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=106 op=UNLOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=105 op=UNLOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.292000 audit: BPF prog-id=107 op=LOAD Jan 23 18:30:33.292000 audit[2707]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2570 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030376339333966346231393532343562666561623761633737346130 Jan 23 18:30:33.304000 audit: BPF prog-id=108 op=LOAD Jan 23 18:30:33.305000 audit: BPF prog-id=109 op=LOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=109 op=UNLOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=110 op=LOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=111 op=LOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=111 op=UNLOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=110 op=UNLOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.305000 audit: BPF prog-id=112 op=LOAD Jan 23 18:30:33.305000 audit[2722]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2600 pid=2722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:33.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234626231376532316561653238643563653439316530316637663961 Jan 23 18:30:33.371959 containerd[1690]: time="2026-01-23T18:30:33.371416434Z" level=info msg="StartContainer for \"dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45\" returns 
successfully" Jan 23 18:30:33.377699 containerd[1690]: time="2026-01-23T18:30:33.377673308Z" level=info msg="StartContainer for \"007c939f4b195245bfeab7ac774a0618fd44f6a360b6921febaf831029123c01\" returns successfully" Jan 23 18:30:33.378367 containerd[1690]: time="2026-01-23T18:30:33.378347841Z" level=info msg="StartContainer for \"24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440\" returns successfully" Jan 23 18:30:33.406554 kubelet[2529]: I0123 18:30:33.406527 2529 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.406807 kubelet[2529]: E0123 18:30:33.406785 2529 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.101:6443/api/v1/nodes\": dial tcp 10.0.9.101:6443: connect: connection refused" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.670737 kubelet[2529]: E0123 18:30:33.670507 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.671120 kubelet[2529]: E0123 18:30:33.671023 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:33.673583 kubelet[2529]: E0123 18:30:33.673566 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.209423 kubelet[2529]: I0123 18:30:34.209401 2529 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.662766 kubelet[2529]: E0123 18:30:34.662736 2529 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.677581 
kubelet[2529]: E0123 18:30:34.677560 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.678622 kubelet[2529]: E0123 18:30:34.678610 2529 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.819697 kubelet[2529]: I0123 18:30:34.819663 2529 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.834761 kubelet[2529]: I0123 18:30:34.834739 2529 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.839826 kubelet[2529]: E0123 18:30:34.839803 2529 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.839826 kubelet[2529]: I0123 18:30:34.839824 2529 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.842973 kubelet[2529]: E0123 18:30:34.842955 2529 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-1-0-2-32611d5cc2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.842973 kubelet[2529]: I0123 18:30:34.842972 2529 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:34.845836 kubelet[2529]: E0123 18:30:34.845818 2529 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" is forbidden: no PriorityClass with name 
system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:35.615555 kubelet[2529]: I0123 18:30:35.615520 2529 apiserver.go:52] "Watching apiserver" Jan 23 18:30:35.637752 kubelet[2529]: I0123 18:30:35.637700 2529 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 18:30:35.970224 kubelet[2529]: I0123 18:30:35.970021 2529 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:36.810357 systemd[1]: Reload requested from client PID 2797 ('systemctl') (unit session-10.scope)... Jan 23 18:30:36.810373 systemd[1]: Reloading... Jan 23 18:30:36.901085 zram_generator::config[2839]: No configuration found. Jan 23 18:30:37.092446 systemd[1]: Reloading finished in 281 ms. Jan 23 18:30:37.110121 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:37.123017 systemd[1]: kubelet.service: Deactivated successfully. Jan 23 18:30:37.123430 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:37.122000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.123616 systemd[1]: kubelet.service: Consumed 1.032s CPU time, 128.9M memory peak. Jan 23 18:30:37.124541 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 23 18:30:37.124597 kernel: audit: type=1131 audit(1769193037.122:400): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.127780 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 18:30:37.127000 audit: BPF prog-id=113 op=LOAD Jan 23 18:30:37.127000 audit: BPF prog-id=68 op=UNLOAD Jan 23 18:30:37.132596 kernel: audit: type=1334 audit(1769193037.127:401): prog-id=113 op=LOAD Jan 23 18:30:37.132637 kernel: audit: type=1334 audit(1769193037.127:402): prog-id=68 op=UNLOAD Jan 23 18:30:37.127000 audit: BPF prog-id=114 op=LOAD Jan 23 18:30:37.134528 kernel: audit: type=1334 audit(1769193037.127:403): prog-id=114 op=LOAD Jan 23 18:30:37.127000 audit: BPF prog-id=115 op=LOAD Jan 23 18:30:37.135559 kernel: audit: type=1334 audit(1769193037.127:404): prog-id=115 op=LOAD Jan 23 18:30:37.135599 kernel: audit: type=1334 audit(1769193037.127:405): prog-id=69 op=UNLOAD Jan 23 18:30:37.127000 audit: BPF prog-id=69 op=UNLOAD Jan 23 18:30:37.127000 audit: BPF prog-id=70 op=UNLOAD Jan 23 18:30:37.140222 kernel: audit: type=1334 audit(1769193037.127:406): prog-id=70 op=UNLOAD Jan 23 18:30:37.140302 kernel: audit: type=1334 audit(1769193037.128:407): prog-id=116 op=LOAD Jan 23 18:30:37.140321 kernel: audit: type=1334 audit(1769193037.128:408): prog-id=74 op=UNLOAD Jan 23 18:30:37.128000 audit: BPF prog-id=116 op=LOAD Jan 23 18:30:37.128000 audit: BPF prog-id=74 op=UNLOAD Jan 23 18:30:37.129000 audit: BPF prog-id=117 op=LOAD Jan 23 18:30:37.141550 kernel: audit: type=1334 audit(1769193037.129:409): prog-id=117 op=LOAD Jan 23 18:30:37.130000 audit: BPF prog-id=65 op=UNLOAD Jan 23 18:30:37.130000 audit: BPF prog-id=118 op=LOAD Jan 23 18:30:37.130000 audit: BPF prog-id=119 op=LOAD Jan 23 18:30:37.130000 audit: BPF prog-id=66 op=UNLOAD Jan 23 18:30:37.130000 audit: BPF prog-id=67 op=UNLOAD Jan 23 18:30:37.132000 audit: BPF prog-id=120 op=LOAD Jan 23 18:30:37.132000 audit: BPF prog-id=79 op=UNLOAD Jan 23 18:30:37.132000 audit: BPF prog-id=121 op=LOAD Jan 23 18:30:37.132000 audit: BPF prog-id=122 op=LOAD Jan 23 18:30:37.132000 audit: BPF prog-id=80 op=UNLOAD Jan 23 18:30:37.132000 audit: BPF prog-id=81 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=123 
op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=71 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=124 op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=125 op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=72 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=73 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=126 op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=76 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=127 op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=128 op=LOAD Jan 23 18:30:37.137000 audit: BPF prog-id=77 op=UNLOAD Jan 23 18:30:37.137000 audit: BPF prog-id=78 op=UNLOAD Jan 23 18:30:37.140000 audit: BPF prog-id=129 op=LOAD Jan 23 18:30:37.140000 audit: BPF prog-id=82 op=UNLOAD Jan 23 18:30:37.142000 audit: BPF prog-id=130 op=LOAD Jan 23 18:30:37.142000 audit: BPF prog-id=75 op=UNLOAD Jan 23 18:30:37.142000 audit: BPF prog-id=131 op=LOAD Jan 23 18:30:37.142000 audit: BPF prog-id=132 op=LOAD Jan 23 18:30:37.142000 audit: BPF prog-id=63 op=UNLOAD Jan 23 18:30:37.142000 audit: BPF prog-id=64 op=UNLOAD Jan 23 18:30:37.279901 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:37.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.288675 (kubelet)[2894]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 18:30:37.329277 kubelet[2894]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:30:37.329277 kubelet[2894]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Jan 23 18:30:37.329277 kubelet[2894]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:30:37.329277 kubelet[2894]: I0123 18:30:37.328597 2894 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 18:30:37.335482 kubelet[2894]: I0123 18:30:37.335447 2894 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 23 18:30:37.335482 kubelet[2894]: I0123 18:30:37.335473 2894 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 18:30:37.335741 kubelet[2894]: I0123 18:30:37.335724 2894 server.go:954] "Client rotation is on, will bootstrap in background" Jan 23 18:30:37.336916 kubelet[2894]: I0123 18:30:37.336893 2894 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 23 18:30:37.341158 kubelet[2894]: I0123 18:30:37.340751 2894 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 18:30:37.345958 kubelet[2894]: I0123 18:30:37.345235 2894 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 18:30:37.348399 kubelet[2894]: I0123 18:30:37.348380 2894 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 23 18:30:37.348640 kubelet[2894]: I0123 18:30:37.348604 2894 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 18:30:37.348829 kubelet[2894]: I0123 18:30:37.348633 2894 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-1-0-2-32611d5cc2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 18:30:37.348908 kubelet[2894]: I0123 18:30:37.348838 2894 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 23 18:30:37.348908 kubelet[2894]: I0123 18:30:37.348855 2894 container_manager_linux.go:304] "Creating device plugin manager" Jan 23 18:30:37.348908 kubelet[2894]: I0123 18:30:37.348907 2894 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:30:37.349504 kubelet[2894]: I0123 18:30:37.349491 2894 kubelet.go:446] "Attempting to sync node with API server" Jan 23 18:30:37.349530 kubelet[2894]: I0123 18:30:37.349514 2894 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 18:30:37.349557 kubelet[2894]: I0123 18:30:37.349535 2894 kubelet.go:352] "Adding apiserver pod source" Jan 23 18:30:37.349557 kubelet[2894]: I0123 18:30:37.349545 2894 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 18:30:37.352334 kubelet[2894]: I0123 18:30:37.351114 2894 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 18:30:37.352334 kubelet[2894]: I0123 18:30:37.351680 2894 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 23 18:30:37.355152 kubelet[2894]: I0123 18:30:37.355139 2894 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 18:30:37.355218 kubelet[2894]: I0123 18:30:37.355213 2894 server.go:1287] "Started kubelet" Jan 23 18:30:37.356996 kubelet[2894]: I0123 18:30:37.356983 2894 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 18:30:37.363618 kubelet[2894]: I0123 18:30:37.363606 2894 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 18:30:37.363754 kubelet[2894]: I0123 18:30:37.363737 2894 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 18:30:37.364578 kubelet[2894]: I0123 18:30:37.364568 2894 server.go:479] "Adding debug handlers to kubelet server" Jan 23 18:30:37.365368 kubelet[2894]: I0123 18:30:37.365337 2894 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 
burstTokens=10 Jan 23 18:30:37.365538 kubelet[2894]: I0123 18:30:37.365530 2894 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 18:30:37.365889 kubelet[2894]: I0123 18:30:37.365876 2894 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 18:30:37.367412 kubelet[2894]: E0123 18:30:37.367396 2894 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-2-32611d5cc2\" not found" Jan 23 18:30:37.370133 kubelet[2894]: I0123 18:30:37.370076 2894 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 23 18:30:37.370302 kubelet[2894]: I0123 18:30:37.370282 2894 reconciler.go:26] "Reconciler: start to sync state" Jan 23 18:30:37.374518 kubelet[2894]: I0123 18:30:37.374499 2894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 23 18:30:37.375456 kubelet[2894]: I0123 18:30:37.375443 2894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 23 18:30:37.375521 kubelet[2894]: I0123 18:30:37.375515 2894 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 23 18:30:37.375573 kubelet[2894]: I0123 18:30:37.375566 2894 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 23 18:30:37.375637 kubelet[2894]: I0123 18:30:37.375632 2894 kubelet.go:2382] "Starting kubelet main sync loop" Jan 23 18:30:37.375746 kubelet[2894]: E0123 18:30:37.375693 2894 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 18:30:37.378343 kubelet[2894]: I0123 18:30:37.378323 2894 factory.go:221] Registration of the systemd container factory successfully Jan 23 18:30:37.378426 kubelet[2894]: I0123 18:30:37.378408 2894 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 18:30:37.382672 kubelet[2894]: I0123 18:30:37.382658 2894 factory.go:221] Registration of the containerd container factory successfully Jan 23 18:30:37.383060 kubelet[2894]: E0123 18:30:37.382691 2894 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 18:30:37.430018 kubelet[2894]: I0123 18:30:37.429991 2894 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 18:30:37.430206 kubelet[2894]: I0123 18:30:37.430191 2894 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 18:30:37.430281 kubelet[2894]: I0123 18:30:37.430275 2894 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:30:37.430461 kubelet[2894]: I0123 18:30:37.430452 2894 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 18:30:37.430515 kubelet[2894]: I0123 18:30:37.430499 2894 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 18:30:37.430557 kubelet[2894]: I0123 18:30:37.430552 2894 policy_none.go:49] "None policy: Start" Jan 23 18:30:37.430593 kubelet[2894]: I0123 18:30:37.430589 2894 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 18:30:37.430627 kubelet[2894]: I0123 18:30:37.430623 2894 state_mem.go:35] "Initializing new in-memory state store" Jan 23 18:30:37.430782 kubelet[2894]: I0123 18:30:37.430775 2894 state_mem.go:75] "Updated machine memory state" Jan 23 18:30:37.434305 kubelet[2894]: I0123 18:30:37.434287 2894 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 23 18:30:37.434543 kubelet[2894]: I0123 18:30:37.434535 2894 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 18:30:37.434613 kubelet[2894]: I0123 18:30:37.434589 2894 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 18:30:37.434980 kubelet[2894]: I0123 18:30:37.434968 2894 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 18:30:37.437858 kubelet[2894]: E0123 18:30:37.435765 2894 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 23 18:30:37.477091 kubelet[2894]: I0123 18:30:37.477057 2894 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.477551 kubelet[2894]: I0123 18:30:37.477351 2894 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.477680 kubelet[2894]: I0123 18:30:37.477427 2894 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.490150 kubelet[2894]: E0123 18:30:37.490116 2894 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-1-0-2-32611d5cc2\" already exists" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.537838 kubelet[2894]: I0123 18:30:37.537805 2894 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.549443 kubelet[2894]: I0123 18:30:37.549218 2894 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.549605 kubelet[2894]: I0123 18:30:37.549530 2894 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.571972 kubelet[2894]: I0123 18:30:37.571829 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.571972 kubelet[2894]: I0123 18:30:37.571868 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-ca-certs\") 
pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.571972 kubelet[2894]: I0123 18:30:37.571889 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.571972 kubelet[2894]: I0123 18:30:37.571906 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-k8s-certs\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.571972 kubelet[2894]: I0123 18:30:37.571922 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.572239 kubelet[2894]: I0123 18:30:37.571939 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9eef90d1ea79562f40cf400747a3d4c7-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-2-32611d5cc2\" (UID: \"9eef90d1ea79562f40cf400747a3d4c7\") " pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.572239 kubelet[2894]: I0123 18:30:37.571987 2894 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.572239 kubelet[2894]: I0123 18:30:37.572033 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/52b3e3fd5c6d5319ece2e4ca03b9ebb0-k8s-certs\") pod \"kube-apiserver-ci-4547-1-0-2-32611d5cc2\" (UID: \"52b3e3fd5c6d5319ece2e4ca03b9ebb0\") " pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:37.572239 kubelet[2894]: I0123 18:30:37.572056 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3779b9220c588cc794d8ce6b970927e8-kubeconfig\") pod \"kube-controller-manager-ci-4547-1-0-2-32611d5cc2\" (UID: \"3779b9220c588cc794d8ce6b970927e8\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" Jan 23 18:30:38.350399 kubelet[2894]: I0123 18:30:38.350342 2894 apiserver.go:52] "Watching apiserver" Jan 23 18:30:38.370784 kubelet[2894]: I0123 18:30:38.370719 2894 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 18:30:38.439286 kubelet[2894]: I0123 18:30:38.439086 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-1-0-2-32611d5cc2" podStartSLOduration=3.438944765 podStartE2EDuration="3.438944765s" podCreationTimestamp="2026-01-23 18:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:30:38.429721584 +0000 UTC m=+1.137057144" watchObservedRunningTime="2026-01-23 18:30:38.438944765 
+0000 UTC m=+1.146280404" Jan 23 18:30:38.448836 kubelet[2894]: I0123 18:30:38.448679 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-1-0-2-32611d5cc2" podStartSLOduration=1.448659355 podStartE2EDuration="1.448659355s" podCreationTimestamp="2026-01-23 18:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:30:38.439378628 +0000 UTC m=+1.146714195" watchObservedRunningTime="2026-01-23 18:30:38.448659355 +0000 UTC m=+1.155994907" Jan 23 18:30:38.449713 kubelet[2894]: I0123 18:30:38.449543 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-1-0-2-32611d5cc2" podStartSLOduration=1.449532257 podStartE2EDuration="1.449532257s" podCreationTimestamp="2026-01-23 18:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:30:38.448918175 +0000 UTC m=+1.156253726" watchObservedRunningTime="2026-01-23 18:30:38.449532257 +0000 UTC m=+1.156867824" Jan 23 18:30:41.894388 update_engine[1661]: I20260123 18:30:41.894304 1661 update_attempter.cc:509] Updating boot flags... Jan 23 18:30:44.057468 kubelet[2894]: I0123 18:30:44.057323 2894 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 18:30:44.058474 containerd[1690]: time="2026-01-23T18:30:44.058025072Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 23 18:30:44.058731 kubelet[2894]: I0123 18:30:44.058194 2894 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 18:30:45.110448 systemd[1]: Created slice kubepods-besteffort-pod0db5e6cc_308d_44f9_aa4d_38a096851738.slice - libcontainer container kubepods-besteffort-pod0db5e6cc_308d_44f9_aa4d_38a096851738.slice. Jan 23 18:30:45.124517 kubelet[2894]: I0123 18:30:45.124344 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0db5e6cc-308d-44f9-aa4d-38a096851738-lib-modules\") pod \"kube-proxy-kmq8g\" (UID: \"0db5e6cc-308d-44f9-aa4d-38a096851738\") " pod="kube-system/kube-proxy-kmq8g" Jan 23 18:30:45.124517 kubelet[2894]: I0123 18:30:45.124409 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkwt\" (UniqueName: \"kubernetes.io/projected/0db5e6cc-308d-44f9-aa4d-38a096851738-kube-api-access-tnkwt\") pod \"kube-proxy-kmq8g\" (UID: \"0db5e6cc-308d-44f9-aa4d-38a096851738\") " pod="kube-system/kube-proxy-kmq8g" Jan 23 18:30:45.124517 kubelet[2894]: I0123 18:30:45.124465 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0db5e6cc-308d-44f9-aa4d-38a096851738-kube-proxy\") pod \"kube-proxy-kmq8g\" (UID: \"0db5e6cc-308d-44f9-aa4d-38a096851738\") " pod="kube-system/kube-proxy-kmq8g" Jan 23 18:30:45.124517 kubelet[2894]: I0123 18:30:45.124480 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0db5e6cc-308d-44f9-aa4d-38a096851738-xtables-lock\") pod \"kube-proxy-kmq8g\" (UID: \"0db5e6cc-308d-44f9-aa4d-38a096851738\") " pod="kube-system/kube-proxy-kmq8g" Jan 23 18:30:45.229700 systemd[1]: Created slice 
kubepods-besteffort-podb1519747_f595_415e_abc7_1bf97df28840.slice - libcontainer container kubepods-besteffort-podb1519747_f595_415e_abc7_1bf97df28840.slice. Jan 23 18:30:45.325609 kubelet[2894]: I0123 18:30:45.325562 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b1519747-f595-415e-abc7-1bf97df28840-var-lib-calico\") pod \"tigera-operator-7dcd859c48-vts8f\" (UID: \"b1519747-f595-415e-abc7-1bf97df28840\") " pod="tigera-operator/tigera-operator-7dcd859c48-vts8f" Jan 23 18:30:45.325609 kubelet[2894]: I0123 18:30:45.325604 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwkf\" (UniqueName: \"kubernetes.io/projected/b1519747-f595-415e-abc7-1bf97df28840-kube-api-access-2dwkf\") pod \"tigera-operator-7dcd859c48-vts8f\" (UID: \"b1519747-f595-415e-abc7-1bf97df28840\") " pod="tigera-operator/tigera-operator-7dcd859c48-vts8f" Jan 23 18:30:45.421579 containerd[1690]: time="2026-01-23T18:30:45.421196357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kmq8g,Uid:0db5e6cc-308d-44f9-aa4d-38a096851738,Namespace:kube-system,Attempt:0,}" Jan 23 18:30:45.447063 containerd[1690]: time="2026-01-23T18:30:45.447022597Z" level=info msg="connecting to shim ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2" address="unix:///run/containerd/s/56b5330f3f61e91f80373380cba1992fdc20dcef54d63b9a40e3f4458ee28ea1" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:30:45.470527 systemd[1]: Started cri-containerd-ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2.scope - libcontainer container ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2. 
Jan 23 18:30:45.478000 audit: BPF prog-id=133 op=LOAD Jan 23 18:30:45.480587 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 23 18:30:45.480641 kernel: audit: type=1334 audit(1769193045.478:442): prog-id=133 op=LOAD Jan 23 18:30:45.480000 audit: BPF prog-id=134 op=LOAD Jan 23 18:30:45.482566 kernel: audit: type=1334 audit(1769193045.480:443): prog-id=134 op=LOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.488682 kernel: audit: type=1300 audit(1769193045.480:443): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.488724 kernel: audit: type=1327 audit(1769193045.480:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.491334 kernel: audit: type=1334 audit(1769193045.480:444): prog-id=134 op=UNLOAD Jan 23 18:30:45.480000 audit: BPF prog-id=134 op=UNLOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2976 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.493505 kernel: audit: type=1300 audit(1769193045.480:444): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.480000 audit: BPF prog-id=135 op=LOAD Jan 23 18:30:45.500330 kernel: audit: type=1327 audit(1769193045.480:444): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.500367 kernel: audit: type=1334 audit(1769193045.480:445): prog-id=135 op=LOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.508392 kernel: audit: type=1300 audit(1769193045.480:445): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.480000 audit: BPF prog-id=136 op=LOAD Jan 23 18:30:45.512403 kernel: audit: type=1327 audit(1769193045.480:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.480000 audit: BPF prog-id=136 op=UNLOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 
Jan 23 18:30:45.480000 audit: BPF prog-id=135 op=UNLOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.480000 audit: BPF prog-id=137 op=LOAD Jan 23 18:30:45.480000 audit[2976]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2964 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561323565393333303539363232356563663731643463656238653563 Jan 23 18:30:45.515185 containerd[1690]: time="2026-01-23T18:30:45.515147865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kmq8g,Uid:0db5e6cc-308d-44f9-aa4d-38a096851738,Namespace:kube-system,Attempt:0,} returns sandbox id \"ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2\"" Jan 23 18:30:45.519684 containerd[1690]: time="2026-01-23T18:30:45.519654369Z" level=info msg="CreateContainer within sandbox \"ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 18:30:45.532189 containerd[1690]: 
time="2026-01-23T18:30:45.530773552Z" level=info msg="Container 99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:30:45.535711 containerd[1690]: time="2026-01-23T18:30:45.535680861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-vts8f,Uid:b1519747-f595-415e-abc7-1bf97df28840,Namespace:tigera-operator,Attempt:0,}" Jan 23 18:30:45.537950 containerd[1690]: time="2026-01-23T18:30:45.537911851Z" level=info msg="CreateContainer within sandbox \"ea25e9330596225ecf71d4ceb8e5ca3ae690f575bef23c2e906ecf1df1d0f6b2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76\"" Jan 23 18:30:45.538563 containerd[1690]: time="2026-01-23T18:30:45.538462236Z" level=info msg="StartContainer for \"99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76\"" Jan 23 18:30:45.540659 containerd[1690]: time="2026-01-23T18:30:45.540635147Z" level=info msg="connecting to shim 99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76" address="unix:///run/containerd/s/56b5330f3f61e91f80373380cba1992fdc20dcef54d63b9a40e3f4458ee28ea1" protocol=ttrpc version=3 Jan 23 18:30:45.559483 containerd[1690]: time="2026-01-23T18:30:45.559380891Z" level=info msg="connecting to shim 4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12" address="unix:///run/containerd/s/bcead1df0e8fa871db231778207e965752e21fe3e09f27e6bbb0a6a614b1dbf9" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:30:45.561600 systemd[1]: Started cri-containerd-99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76.scope - libcontainer container 99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76. 
Jan 23 18:30:45.594746 systemd[1]: Started cri-containerd-4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12.scope - libcontainer container 4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12. Jan 23 18:30:45.600000 audit: BPF prog-id=138 op=LOAD Jan 23 18:30:45.600000 audit[3001]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2964 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939623163303061626531316438313238333531346166363634356162 Jan 23 18:30:45.600000 audit: BPF prog-id=139 op=LOAD Jan 23 18:30:45.600000 audit[3001]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2964 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939623163303061626531316438313238333531346166363634356162 Jan 23 18:30:45.600000 audit: BPF prog-id=139 op=UNLOAD Jan 23 18:30:45.600000 audit[3001]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.600000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939623163303061626531316438313238333531346166363634356162 Jan 23 18:30:45.600000 audit: BPF prog-id=138 op=UNLOAD Jan 23 18:30:45.600000 audit[3001]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939623163303061626531316438313238333531346166363634356162 Jan 23 18:30:45.600000 audit: BPF prog-id=140 op=LOAD Jan 23 18:30:45.600000 audit[3001]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2964 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939623163303061626531316438313238333531346166363634356162 Jan 23 18:30:45.616000 audit: BPF prog-id=141 op=LOAD Jan 23 18:30:45.617000 audit: BPF prog-id=142 op=LOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=142 op=UNLOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=143 op=LOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=144 op=LOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=144 op=UNLOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=143 op=UNLOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.617000 audit: BPF prog-id=145 op=LOAD Jan 23 18:30:45.617000 audit[3042]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3022 pid=3042 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.617000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463323834623730323065386566613865616132323764336262326262 Jan 23 18:30:45.634836 containerd[1690]: time="2026-01-23T18:30:45.634797880Z" level=info msg="StartContainer for \"99b1c00abe11d81283514af6645abc21ee236c85ebb145c3916aab458d35ae76\" returns successfully" Jan 23 18:30:45.662289 containerd[1690]: time="2026-01-23T18:30:45.662223255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-vts8f,Uid:b1519747-f595-415e-abc7-1bf97df28840,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12\"" Jan 23 18:30:45.664309 containerd[1690]: time="2026-01-23T18:30:45.664009107Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 18:30:45.762000 audit[3111]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.762000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffc9ca1540 a2=0 a3=7fffc9ca152c items=0 ppid=3032 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.762000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 18:30:45.764000 audit[3113]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 
18:30:45.764000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdfffbeed0 a2=0 a3=7ffdfffbeebc items=0 ppid=3032 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.764000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 18:30:45.765000 audit[3114]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_chain pid=3114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.765000 audit[3114]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffffd92eff0 a2=0 a3=7ffffd92efdc items=0 ppid=3032 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.765000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 18:30:45.769000 audit[3112]: NETFILTER_CFG table=mangle:57 family=10 entries=1 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.769000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc306bc3a0 a2=0 a3=7ffc306bc38c items=0 ppid=3032 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.769000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 18:30:45.770000 audit[3116]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3116 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.770000 audit[3116]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe6dcf3010 a2=0 a3=7ffe6dcf2ffc items=0 ppid=3032 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.770000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 18:30:45.771000 audit[3117]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.771000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffec72930a0 a2=0 a3=7ffec729308c items=0 ppid=3032 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.771000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 18:30:45.870000 audit[3118]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.870000 audit[3118]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe97efa360 a2=0 a3=7ffe97efa34c items=0 ppid=3032 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.870000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 18:30:45.873000 audit[3120]: NETFILTER_CFG table=filter:61 
family=2 entries=1 op=nft_register_rule pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.873000 audit[3120]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc95c81db0 a2=0 a3=7ffc95c81d9c items=0 ppid=3032 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.873000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 23 18:30:45.876000 audit[3123]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.876000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd57daed20 a2=0 a3=7ffd57daed0c items=0 ppid=3032 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.876000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 23 18:30:45.877000 audit[3124]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.877000 audit[3124]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda4b2ef40 a2=0 a3=7ffda4b2ef2c items=0 ppid=3032 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.877000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 18:30:45.880000 audit[3126]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.880000 audit[3126]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffffad35e60 a2=0 a3=7ffffad35e4c items=0 ppid=3032 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.880000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 18:30:45.881000 audit[3127]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.881000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0e4d8b30 a2=0 a3=7ffe0e4d8b1c items=0 ppid=3032 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.881000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 18:30:45.883000 audit[3129]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.883000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 
a0=3 a1=7fff3466c670 a2=0 a3=7fff3466c65c items=0 ppid=3032 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.883000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 18:30:45.887000 audit[3132]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.887000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff34b9bc60 a2=0 a3=7fff34b9bc4c items=0 ppid=3032 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.887000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 23 18:30:45.888000 audit[3133]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.888000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd05dbb8e0 a2=0 a3=7ffd05dbb8cc items=0 ppid=3032 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.888000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 18:30:45.891000 audit[3135]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.891000 audit[3135]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe49eb5a50 a2=0 a3=7ffe49eb5a3c items=0 ppid=3032 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.891000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 18:30:45.892000 audit[3136]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.892000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc18fe4c70 a2=0 a3=7ffc18fe4c5c items=0 ppid=3032 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.892000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 18:30:45.897000 audit[3138]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.897000 audit[3138]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffef561c7b0 a2=0 a3=7ffef561c79c items=0 ppid=3032 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.897000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:30:45.900000 audit[3141]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.900000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffec4c0b8a0 a2=0 a3=7ffec4c0b88c items=0 ppid=3032 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.900000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:30:45.904000 audit[3144]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.904000 audit[3144]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd5f553f70 a2=0 a3=7ffd5f553f5c items=0 ppid=3032 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.904000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 18:30:45.905000 audit[3145]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.905000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe66c918f0 a2=0 a3=7ffe66c918dc items=0 ppid=3032 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.905000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 18:30:45.908000 audit[3147]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.908000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc4a242230 a2=0 a3=7ffc4a24221c items=0 ppid=3032 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.908000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:30:45.911000 audit[3150]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.911000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdb9e69b10 a2=0 a3=7ffdb9e69afc 
items=0 ppid=3032 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.911000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:30:45.912000 audit[3151]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.912000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe2e25fb50 a2=0 a3=7ffe2e25fb3c items=0 ppid=3032 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.912000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 18:30:45.915000 audit[3153]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:45.915000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffe34098550 a2=0 a3=7ffe3409853c items=0 ppid=3032 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.915000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 18:30:45.940000 audit[3159]: 
NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:45.940000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc2893dfb0 a2=0 a3=7ffc2893df9c items=0 ppid=3032 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.940000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:45.950000 audit[3159]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:45.950000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffc2893dfb0 a2=0 a3=7ffc2893df9c items=0 ppid=3032 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.950000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:45.952000 audit[3164]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3164 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.952000 audit[3164]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffcd2992c00 a2=0 a3=7ffcd2992bec items=0 ppid=3032 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.952000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 18:30:45.954000 audit[3166]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.954000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffced27c460 a2=0 a3=7ffced27c44c items=0 ppid=3032 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.954000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 23 18:30:45.958000 audit[3169]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.958000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff1f527360 a2=0 a3=7fff1f52734c items=0 ppid=3032 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.958000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 23 18:30:45.959000 audit[3170]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3170 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.959000 audit[3170]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe7d8550f0 a2=0 a3=7ffe7d8550dc items=0 ppid=3032 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.959000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 18:30:45.962000 audit[3172]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3172 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.962000 audit[3172]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffaafbfcd0 a2=0 a3=7fffaafbfcbc items=0 ppid=3032 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.962000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 18:30:45.963000 audit[3173]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.963000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4c915890 a2=0 a3=7fff4c91587c items=0 ppid=3032 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.963000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 18:30:45.965000 audit[3175]: 
NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.965000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe18eeae10 a2=0 a3=7ffe18eeadfc items=0 ppid=3032 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.965000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 23 18:30:45.969000 audit[3178]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.969000 audit[3178]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fff92fcffa0 a2=0 a3=7fff92fcff8c items=0 ppid=3032 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.969000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 18:30:45.970000 audit[3179]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.970000 audit[3179]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff73bb0bd0 a2=0 a3=7fff73bb0bbc items=0 ppid=3032 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.970000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 18:30:45.972000 audit[3181]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.972000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffea08d9210 a2=0 a3=7ffea08d91fc items=0 ppid=3032 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.972000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 18:30:45.973000 audit[3182]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3182 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.973000 audit[3182]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff96fd7840 a2=0 a3=7fff96fd782c items=0 ppid=3032 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.973000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 18:30:45.976000 audit[3184]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.976000 audit[3184]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd408e4080 a2=0 a3=7ffd408e406c items=0 ppid=3032 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.976000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:30:45.979000 audit[3187]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.979000 audit[3187]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff35a2d870 a2=0 a3=7fff35a2d85c items=0 ppid=3032 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.979000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 18:30:45.982000 audit[3190]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.982000 audit[3190]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff37ff2080 a2=0 a3=7fff37ff206c items=0 ppid=3032 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.982000 audit: 
PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 23 18:30:45.983000 audit[3191]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.983000 audit[3191]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffebb142000 a2=0 a3=7ffebb141fec items=0 ppid=3032 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.983000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 18:30:45.986000 audit[3193]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.986000 audit[3193]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff4aa0f860 a2=0 a3=7fff4aa0f84c items=0 ppid=3032 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.986000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:30:45.989000 audit[3196]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.989000 audit[3196]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc7e514d70 
a2=0 a3=7ffc7e514d5c items=0 ppid=3032 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.989000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:30:45.990000 audit[3197]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.990000 audit[3197]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe3f931280 a2=0 a3=7ffe3f93126c items=0 ppid=3032 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.990000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 18:30:45.993000 audit[3199]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.993000 audit[3199]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffcfffd6ef0 a2=0 a3=7ffcfffd6edc items=0 ppid=3032 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.993000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 
18:30:45.994000 audit[3200]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.994000 audit[3200]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff27814030 a2=0 a3=7fff2781401c items=0 ppid=3032 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.994000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 18:30:45.996000 audit[3202]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:45.996000 audit[3202]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffdcffa0400 a2=0 a3=7ffdcffa03ec items=0 ppid=3032 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:45.996000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:30:46.000000 audit[3205]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:46.000000 audit[3205]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffce5b695e0 a2=0 a3=7ffce5b695cc items=0 ppid=3032 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:46.000000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:30:46.005000 audit[3207]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 18:30:46.005000 audit[3207]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fff3e753b50 a2=0 a3=7fff3e753b3c items=0 ppid=3032 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:46.005000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:46.006000 audit[3207]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 18:30:46.006000 audit[3207]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fff3e753b50 a2=0 a3=7fff3e753b3c items=0 ppid=3032 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:46.006000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:46.247495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3618278938.mount: Deactivated successfully. Jan 23 18:30:47.291110 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2738008978.mount: Deactivated successfully. 
Jan 23 18:30:48.130270 containerd[1690]: time="2026-01-23T18:30:48.130210275Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:48.131529 containerd[1690]: time="2026-01-23T18:30:48.131342117Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 23 18:30:48.132319 containerd[1690]: time="2026-01-23T18:30:48.132296139Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:48.134324 containerd[1690]: time="2026-01-23T18:30:48.134295950Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:48.134881 containerd[1690]: time="2026-01-23T18:30:48.134796442Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.470758997s" Jan 23 18:30:48.134881 containerd[1690]: time="2026-01-23T18:30:48.134819222Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 23 18:30:48.137210 containerd[1690]: time="2026-01-23T18:30:48.137182581Z" level=info msg="CreateContainer within sandbox \"4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 18:30:48.146666 containerd[1690]: time="2026-01-23T18:30:48.146546632Z" level=info msg="Container 
1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:30:48.149121 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount386092603.mount: Deactivated successfully. Jan 23 18:30:48.163145 containerd[1690]: time="2026-01-23T18:30:48.163104478Z" level=info msg="CreateContainer within sandbox \"4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\"" Jan 23 18:30:48.164761 containerd[1690]: time="2026-01-23T18:30:48.164727001Z" level=info msg="StartContainer for \"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\"" Jan 23 18:30:48.165803 containerd[1690]: time="2026-01-23T18:30:48.165769975Z" level=info msg="connecting to shim 1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0" address="unix:///run/containerd/s/bcead1df0e8fa871db231778207e965752e21fe3e09f27e6bbb0a6a614b1dbf9" protocol=ttrpc version=3 Jan 23 18:30:48.188465 systemd[1]: Started cri-containerd-1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0.scope - libcontainer container 1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0. 
Jan 23 18:30:48.196000 audit: BPF prog-id=146 op=LOAD Jan 23 18:30:48.197000 audit: BPF prog-id=147 op=LOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=147 op=UNLOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=148 op=LOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=149 op=LOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=149 op=UNLOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=148 op=UNLOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.197000 audit: BPF prog-id=150 op=LOAD Jan 23 18:30:48.197000 audit[3217]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3022 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.197000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303265616666366364373365313633316266396464396637653030 Jan 23 18:30:48.213533 containerd[1690]: time="2026-01-23T18:30:48.213484908Z" level=info msg="StartContainer for \"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\" returns successfully" Jan 23 18:30:48.436026 kubelet[2894]: I0123 18:30:48.435869 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-kmq8g" podStartSLOduration=3.43584776 podStartE2EDuration="3.43584776s" podCreationTimestamp="2026-01-23 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:30:46.430602709 +0000 UTC m=+9.137938275" watchObservedRunningTime="2026-01-23 18:30:48.43584776 +0000 UTC m=+11.143183322" Jan 23 18:30:48.436026 kubelet[2894]: I0123 18:30:48.435985 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-vts8f" podStartSLOduration=0.963867889 podStartE2EDuration="3.435980959s" 
podCreationTimestamp="2026-01-23 18:30:45 +0000 UTC" firstStartedPulling="2026-01-23 18:30:45.66343416 +0000 UTC m=+8.370769705" lastFinishedPulling="2026-01-23 18:30:48.135547232 +0000 UTC m=+10.842882775" observedRunningTime="2026-01-23 18:30:48.434621985 +0000 UTC m=+11.141957549" watchObservedRunningTime="2026-01-23 18:30:48.435980959 +0000 UTC m=+11.143316526" Jan 23 18:30:53.736059 sudo[1960]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:53.734000 audit[1960]: USER_END pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:53.736744 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 23 18:30:53.736820 kernel: audit: type=1106 audit(1769193053.734:522): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:53.739000 audit[1960]: CRED_DISP pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:53.747277 kernel: audit: type=1104 audit(1769193053.739:523): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:53.838289 sshd[1959]: Connection closed by 68.220.241.50 port 34570 Jan 23 18:30:53.839437 sshd-session[1955]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:53.839000 audit[1955]: USER_END pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:53.847033 systemd[1]: sshd@8-10.0.9.101:22-68.220.241.50:34570.service: Deactivated successfully. Jan 23 18:30:53.847287 kernel: audit: type=1106 audit(1769193053.839:524): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:53.839000 audit[1955]: CRED_DISP pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:53.852568 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 18:30:53.853280 kernel: audit: type=1104 audit(1769193053.839:525): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:30:53.853904 systemd[1]: session-10.scope: Consumed 4.168s CPU time, 232M memory peak. 
Jan 23 18:30:53.858443 kernel: audit: type=1131 audit(1769193053.846:526): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.9.101:22-68.220.241.50:34570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:53.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.9.101:22-68.220.241.50:34570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:53.858136 systemd-logind[1659]: Session 10 logged out. Waiting for processes to exit. Jan 23 18:30:53.860358 systemd-logind[1659]: Removed session 10. Jan 23 18:30:54.642000 audit[3293]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3293 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.646989 kernel: audit: type=1325 audit(1769193054.642:527): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3293 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.642000 audit[3293]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe21ec2aa0 a2=0 a3=7ffe21ec2a8c items=0 ppid=3032 pid=3293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.651273 kernel: audit: type=1300 audit(1769193054.642:527): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe21ec2aa0 a2=0 a3=7ffe21ec2a8c items=0 ppid=3032 pid=3293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.642000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:54.650000 audit[3293]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3293 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.655340 kernel: audit: type=1327 audit(1769193054.642:527): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:54.655396 kernel: audit: type=1325 audit(1769193054.650:528): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3293 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.650000 audit[3293]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe21ec2aa0 a2=0 a3=0 items=0 ppid=3032 pid=3293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.658970 kernel: audit: type=1300 audit(1769193054.650:528): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe21ec2aa0 a2=0 a3=0 items=0 ppid=3032 pid=3293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.650000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:54.665000 audit[3295]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.665000 audit[3295]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc65a3c900 a2=0 a3=7ffc65a3c8ec items=0 ppid=3032 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.665000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:54.671000 audit[3295]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:54.671000 audit[3295]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc65a3c900 a2=0 a3=0 items=0 ppid=3032 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:54.671000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:57.983000 audit[3300]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3300 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:57.983000 audit[3300]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffd310f8ac0 a2=0 a3=7ffd310f8aac items=0 ppid=3032 pid=3300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:57.983000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:57.997000 audit[3300]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3300 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:57.997000 audit[3300]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd310f8ac0 a2=0 a3=0 items=0 ppid=3032 pid=3300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:57.997000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:58.015000 audit[3302]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3302 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:58.015000 audit[3302]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7fff968a83d0 a2=0 a3=7fff968a83bc items=0 ppid=3032 pid=3302 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:58.015000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:58.020000 audit[3302]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3302 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:58.020000 audit[3302]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff968a83d0 a2=0 a3=0 items=0 ppid=3032 pid=3302 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:58.020000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:59.038000 audit[3304]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:59.040713 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 23 18:30:59.040749 kernel: audit: type=1325 audit(1769193059.038:535): 
table=filter:113 family=2 entries=19 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:59.038000 audit[3304]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdbe5fe090 a2=0 a3=7ffdbe5fe07c items=0 ppid=3032 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:59.045584 kernel: audit: type=1300 audit(1769193059.038:535): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdbe5fe090 a2=0 a3=7ffdbe5fe07c items=0 ppid=3032 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:59.038000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:59.051266 kernel: audit: type=1327 audit(1769193059.038:535): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:59.047000 audit[3304]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:59.047000 audit[3304]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdbe5fe090 a2=0 a3=0 items=0 ppid=3032 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:59.056019 kernel: audit: type=1325 audit(1769193059.047:536): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3304 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:30:59.056067 kernel: audit: type=1300 
audit(1769193059.047:536): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdbe5fe090 a2=0 a3=0 items=0 ppid=3032 pid=3304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:59.047000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:59.062280 kernel: audit: type=1327 audit(1769193059.047:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:30:59.645117 systemd[1]: Created slice kubepods-besteffort-pod456ee9f9_f37c_4b2f_bc1c_fb374e84b8d8.slice - libcontainer container kubepods-besteffort-pod456ee9f9_f37c_4b2f_bc1c_fb374e84b8d8.slice. Jan 23 18:30:59.712215 kubelet[2894]: I0123 18:30:59.712172 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8-typha-certs\") pod \"calico-typha-788cb4b5fc-w4bjl\" (UID: \"456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8\") " pod="calico-system/calico-typha-788cb4b5fc-w4bjl" Jan 23 18:30:59.712215 kubelet[2894]: I0123 18:30:59.712214 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhmb\" (UniqueName: \"kubernetes.io/projected/456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8-kube-api-access-jxhmb\") pod \"calico-typha-788cb4b5fc-w4bjl\" (UID: \"456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8\") " pod="calico-system/calico-typha-788cb4b5fc-w4bjl" Jan 23 18:30:59.712614 kubelet[2894]: I0123 18:30:59.712235 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8-tigera-ca-bundle\") pod 
\"calico-typha-788cb4b5fc-w4bjl\" (UID: \"456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8\") " pod="calico-system/calico-typha-788cb4b5fc-w4bjl" Jan 23 18:30:59.818409 kubelet[2894]: I0123 18:30:59.817928 2894 status_manager.go:890] "Failed to get status for pod" podUID="d56a2bd8-afdd-445c-8c0d-125ed8e25d42" pod="calico-system/calico-node-468qx" err="pods \"calico-node-468qx\" is forbidden: User \"system:node:ci-4547-1-0-2-32611d5cc2\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4547-1-0-2-32611d5cc2' and this object" Jan 23 18:30:59.827189 systemd[1]: Created slice kubepods-besteffort-podd56a2bd8_afdd_445c_8c0d_125ed8e25d42.slice - libcontainer container kubepods-besteffort-podd56a2bd8_afdd_445c_8c0d_125ed8e25d42.slice. Jan 23 18:30:59.913816 kubelet[2894]: I0123 18:30:59.913381 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-cni-bin-dir\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913816 kubelet[2894]: I0123 18:30:59.913428 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-cni-log-dir\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913816 kubelet[2894]: I0123 18:30:59.913446 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-cni-net-dir\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913816 kubelet[2894]: I0123 
18:30:59.913461 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-flexvol-driver-host\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913816 kubelet[2894]: I0123 18:30:59.913476 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-lib-modules\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913999 kubelet[2894]: I0123 18:30:59.913492 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-node-certs\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913999 kubelet[2894]: I0123 18:30:59.913508 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-xtables-lock\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913999 kubelet[2894]: I0123 18:30:59.913542 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-var-lib-calico\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913999 kubelet[2894]: I0123 18:30:59.913567 2894 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p245n\" (UniqueName: \"kubernetes.io/projected/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-kube-api-access-p245n\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.913999 kubelet[2894]: I0123 18:30:59.913582 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-tigera-ca-bundle\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.914096 kubelet[2894]: I0123 18:30:59.913598 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-policysync\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.914096 kubelet[2894]: I0123 18:30:59.913612 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d56a2bd8-afdd-445c-8c0d-125ed8e25d42-var-run-calico\") pod \"calico-node-468qx\" (UID: \"d56a2bd8-afdd-445c-8c0d-125ed8e25d42\") " pod="calico-system/calico-node-468qx" Jan 23 18:30:59.949631 containerd[1690]: time="2026-01-23T18:30:59.949590114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-788cb4b5fc-w4bjl,Uid:456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8,Namespace:calico-system,Attempt:0,}" Jan 23 18:30:59.977744 containerd[1690]: time="2026-01-23T18:30:59.977420990Z" level=info msg="connecting to shim 8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a" address="unix:///run/containerd/s/2a94e43ab1d84b9c3061f164360c13e4c1ce36aaab49f4052744631eb807127d" 
namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:00.009578 systemd[1]: Started cri-containerd-8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a.scope - libcontainer container 8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a. Jan 23 18:31:00.017116 kubelet[2894]: E0123 18:31:00.017067 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.017833 kubelet[2894]: W0123 18:31:00.017244 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.017833 kubelet[2894]: E0123 18:31:00.017322 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.018363 kubelet[2894]: E0123 18:31:00.018317 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.018363 kubelet[2894]: W0123 18:31:00.018330 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.019448 kubelet[2894]: E0123 18:31:00.019402 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.019448 kubelet[2894]: W0123 18:31:00.019416 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.020016 kubelet[2894]: E0123 18:31:00.019976 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from 
directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.020016 kubelet[2894]: E0123 18:31:00.019992 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.020274 kubelet[2894]: E0123 18:31:00.020231 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.020274 kubelet[2894]: W0123 18:31:00.020247 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.020912 kubelet[2894]: E0123 18:31:00.020607 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.022552 kubelet[2894]: E0123 18:31:00.021533 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:00.022896 kubelet[2894]: E0123 18:31:00.022306 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.023045 kubelet[2894]: W0123 18:31:00.023035 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.023183 kubelet[2894]: E0123 18:31:00.023091 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.023760 kubelet[2894]: E0123 18:31:00.023294 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.023919 kubelet[2894]: W0123 18:31:00.023910 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.023973 kubelet[2894]: E0123 18:31:00.023960 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.024648 kubelet[2894]: E0123 18:31:00.024591 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.024648 kubelet[2894]: W0123 18:31:00.024610 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.024648 kubelet[2894]: E0123 18:31:00.024620 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.025198 kubelet[2894]: E0123 18:31:00.025119 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.025198 kubelet[2894]: W0123 18:31:00.025130 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.025198 kubelet[2894]: E0123 18:31:00.025141 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.025555 kubelet[2894]: E0123 18:31:00.025534 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.025555 kubelet[2894]: W0123 18:31:00.025545 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.025555 kubelet[2894]: E0123 18:31:00.025554 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.032980 kubelet[2894]: E0123 18:31:00.032961 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.032980 kubelet[2894]: W0123 18:31:00.032978 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.033064 kubelet[2894]: E0123 18:31:00.032993 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.037254 kubelet[2894]: E0123 18:31:00.037195 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.037254 kubelet[2894]: W0123 18:31:00.037294 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.037254 kubelet[2894]: E0123 18:31:00.037311 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.054000 audit: BPF prog-id=151 op=LOAD Jan 23 18:31:00.056000 audit: BPF prog-id=152 op=LOAD Jan 23 18:31:00.058673 kernel: audit: type=1334 audit(1769193060.054:537): prog-id=151 op=LOAD Jan 23 18:31:00.058728 kernel: audit: type=1334 audit(1769193060.056:538): prog-id=152 op=LOAD Jan 23 18:31:00.056000 audit[3326]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.066279 kernel: audit: type=1300 audit(1769193060.056:538): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.073659 kernel: audit: type=1327 audit(1769193060.056:538): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=152 op=UNLOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=153 op=LOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=154 op=LOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=154 op=UNLOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=153 op=UNLOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.058000 audit: BPF prog-id=155 op=LOAD Jan 23 18:31:00.058000 audit[3326]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3315 pid=3326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.058000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838333165363537646464383466633762613262616562373363373337 Jan 23 18:31:00.075000 audit[3383]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:00.075000 audit[3383]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=8224 a0=3 a1=7ffcc28b5160 a2=0 a3=7ffcc28b514c items=0 ppid=3032 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.075000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:00.078000 audit[3383]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:00.078000 audit[3383]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcc28b5160 a2=0 a3=0 items=0 ppid=3032 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.078000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:00.113941 kubelet[2894]: E0123 18:31:00.113915 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.113941 kubelet[2894]: W0123 18:31:00.113934 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114106 kubelet[2894]: E0123 18:31:00.113957 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.114149 kubelet[2894]: E0123 18:31:00.114117 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114149 kubelet[2894]: W0123 18:31:00.114122 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114149 kubelet[2894]: E0123 18:31:00.114130 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.114313 kubelet[2894]: E0123 18:31:00.114237 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114313 kubelet[2894]: W0123 18:31:00.114242 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114313 kubelet[2894]: E0123 18:31:00.114248 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.114429 kubelet[2894]: E0123 18:31:00.114419 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114429 kubelet[2894]: W0123 18:31:00.114427 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114497 kubelet[2894]: E0123 18:31:00.114434 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.114573 kubelet[2894]: E0123 18:31:00.114564 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114573 kubelet[2894]: W0123 18:31:00.114572 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114639 kubelet[2894]: E0123 18:31:00.114578 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.114700 kubelet[2894]: E0123 18:31:00.114678 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114700 kubelet[2894]: W0123 18:31:00.114686 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114700 kubelet[2894]: E0123 18:31:00.114693 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.114801 kubelet[2894]: E0123 18:31:00.114792 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114801 kubelet[2894]: W0123 18:31:00.114800 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.114860 kubelet[2894]: E0123 18:31:00.114805 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.114912 kubelet[2894]: E0123 18:31:00.114904 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.114912 kubelet[2894]: W0123 18:31:00.114911 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115023 kubelet[2894]: E0123 18:31:00.114917 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.115044 kubelet[2894]: E0123 18:31:00.115025 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115044 kubelet[2894]: W0123 18:31:00.115030 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115044 kubelet[2894]: E0123 18:31:00.115035 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.115201 kubelet[2894]: E0123 18:31:00.115123 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115201 kubelet[2894]: W0123 18:31:00.115131 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115201 kubelet[2894]: E0123 18:31:00.115136 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.115345 kubelet[2894]: E0123 18:31:00.115230 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115345 kubelet[2894]: W0123 18:31:00.115235 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115345 kubelet[2894]: E0123 18:31:00.115240 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.115588 kubelet[2894]: E0123 18:31:00.115519 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115588 kubelet[2894]: W0123 18:31:00.115527 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115588 kubelet[2894]: E0123 18:31:00.115539 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.115729 kubelet[2894]: E0123 18:31:00.115687 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115729 kubelet[2894]: W0123 18:31:00.115694 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115729 kubelet[2894]: E0123 18:31:00.115700 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.115848 kubelet[2894]: E0123 18:31:00.115798 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115848 kubelet[2894]: W0123 18:31:00.115802 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115848 kubelet[2894]: E0123 18:31:00.115808 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.115917 kubelet[2894]: E0123 18:31:00.115903 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.115917 kubelet[2894]: W0123 18:31:00.115908 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.115917 kubelet[2894]: E0123 18:31:00.115915 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116009 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116329 kubelet[2894]: W0123 18:31:00.116016 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116021 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116143 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116329 kubelet[2894]: W0123 18:31:00.116148 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116153 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116244 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116329 kubelet[2894]: W0123 18:31:00.116248 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116329 kubelet[2894]: E0123 18:31:00.116253 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.116525 containerd[1690]: time="2026-01-23T18:31:00.116234862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-788cb4b5fc-w4bjl,Uid:456ee9f9-f37c-4b2f-bc1c-fb374e84b8d8,Namespace:calico-system,Attempt:0,} returns sandbox id \"8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a\"" Jan 23 18:31:00.116556 kubelet[2894]: E0123 18:31:00.116361 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116556 kubelet[2894]: W0123 18:31:00.116365 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116556 kubelet[2894]: E0123 18:31:00.116370 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.116556 kubelet[2894]: E0123 18:31:00.116462 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116556 kubelet[2894]: W0123 18:31:00.116466 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116556 kubelet[2894]: E0123 18:31:00.116472 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.116665 kubelet[2894]: E0123 18:31:00.116643 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.116665 kubelet[2894]: W0123 18:31:00.116648 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.116665 kubelet[2894]: E0123 18:31:00.116654 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.116724 kubelet[2894]: I0123 18:31:00.116679 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e681e1b7-9935-4d75-8509-9acd7616e3d8-socket-dir\") pod \"csi-node-driver-j9dvq\" (UID: \"e681e1b7-9935-4d75-8509-9acd7616e3d8\") " pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:00.117364 kubelet[2894]: E0123 18:31:00.116801 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.117364 kubelet[2894]: W0123 18:31:00.116811 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.117364 kubelet[2894]: E0123 18:31:00.116818 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.117364 kubelet[2894]: I0123 18:31:00.116831 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprcl\" (UniqueName: \"kubernetes.io/projected/e681e1b7-9935-4d75-8509-9acd7616e3d8-kube-api-access-cprcl\") pod \"csi-node-driver-j9dvq\" (UID: \"e681e1b7-9935-4d75-8509-9acd7616e3d8\") " pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:00.117364 kubelet[2894]: E0123 18:31:00.116952 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.117364 kubelet[2894]: W0123 18:31:00.116958 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.117364 kubelet[2894]: E0123 18:31:00.116964 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.117364 kubelet[2894]: I0123 18:31:00.116975 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e681e1b7-9935-4d75-8509-9acd7616e3d8-varrun\") pod \"csi-node-driver-j9dvq\" (UID: \"e681e1b7-9935-4d75-8509-9acd7616e3d8\") " pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:00.117364 kubelet[2894]: E0123 18:31:00.117132 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.117578 kubelet[2894]: W0123 18:31:00.117142 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.117578 kubelet[2894]: E0123 18:31:00.117159 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.117808 kubelet[2894]: E0123 18:31:00.117713 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.117808 kubelet[2894]: W0123 18:31:00.117722 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.117808 kubelet[2894]: E0123 18:31:00.117733 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.117974 kubelet[2894]: E0123 18:31:00.117908 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.117974 kubelet[2894]: W0123 18:31:00.117914 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.117974 kubelet[2894]: E0123 18:31:00.117920 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.118215 kubelet[2894]: E0123 18:31:00.118208 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.118405 kubelet[2894]: W0123 18:31:00.118338 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.118547 kubelet[2894]: E0123 18:31:00.118540 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.118580 kubelet[2894]: W0123 18:31:00.118575 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.118899 kubelet[2894]: E0123 18:31:00.118797 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.119106 kubelet[2894]: W0123 18:31:00.119029 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.119106 kubelet[2894]: E0123 18:31:00.119043 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.119106 kubelet[2894]: E0123 18:31:00.118840 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.119106 kubelet[2894]: I0123 18:31:00.119064 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e681e1b7-9935-4d75-8509-9acd7616e3d8-registration-dir\") pod \"csi-node-driver-j9dvq\" (UID: \"e681e1b7-9935-4d75-8509-9acd7616e3d8\") " pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:00.119106 kubelet[2894]: E0123 18:31:00.118833 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.119315 containerd[1690]: time="2026-01-23T18:31:00.119300380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 23 18:31:00.119463 kubelet[2894]: E0123 18:31:00.119451 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.119463 kubelet[2894]: W0123 18:31:00.119462 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.119512 kubelet[2894]: E0123 18:31:00.119475 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.119733 kubelet[2894]: E0123 18:31:00.119723 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.119733 kubelet[2894]: W0123 18:31:00.119731 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.119780 kubelet[2894]: E0123 18:31:00.119743 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.119780 kubelet[2894]: I0123 18:31:00.119758 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e681e1b7-9935-4d75-8509-9acd7616e3d8-kubelet-dir\") pod \"csi-node-driver-j9dvq\" (UID: \"e681e1b7-9935-4d75-8509-9acd7616e3d8\") " pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:00.120197 kubelet[2894]: E0123 18:31:00.120184 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.120197 kubelet[2894]: W0123 18:31:00.120195 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.120319 kubelet[2894]: E0123 18:31:00.120206 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.120918 kubelet[2894]: E0123 18:31:00.120667 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.120918 kubelet[2894]: W0123 18:31:00.120678 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.120918 kubelet[2894]: E0123 18:31:00.120708 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.121327 kubelet[2894]: E0123 18:31:00.121050 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.121327 kubelet[2894]: W0123 18:31:00.121061 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.121327 kubelet[2894]: E0123 18:31:00.121070 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.121327 kubelet[2894]: E0123 18:31:00.121242 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.121327 kubelet[2894]: W0123 18:31:00.121248 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.121327 kubelet[2894]: E0123 18:31:00.121255 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.131121 containerd[1690]: time="2026-01-23T18:31:00.130832906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-468qx,Uid:d56a2bd8-afdd-445c-8c0d-125ed8e25d42,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:00.158944 containerd[1690]: time="2026-01-23T18:31:00.158906428Z" level=info msg="connecting to shim 1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13" address="unix:///run/containerd/s/58f7af9f937b6c7ae459f14b71334f9d99d0ffa40e559799802a1193eeb75e63" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:00.197494 systemd[1]: Started cri-containerd-1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13.scope - libcontainer container 1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13. 
Jan 23 18:31:00.206000 audit: BPF prog-id=156 op=LOAD Jan 23 18:31:00.206000 audit: BPF prog-id=157 op=LOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=157 op=UNLOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=158 op=LOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=159 op=LOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=159 op=UNLOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=158 op=UNLOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.206000 audit: BPF prog-id=160 op=LOAD Jan 23 18:31:00.206000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:00.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165666232623461393934303333353361363530323733613431343930 Jan 23 18:31:00.221820 kubelet[2894]: E0123 18:31:00.221495 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.221820 kubelet[2894]: W0123 18:31:00.221510 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.221820 kubelet[2894]: E0123 18:31:00.221528 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.222075 containerd[1690]: time="2026-01-23T18:31:00.221440476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-468qx,Uid:d56a2bd8-afdd-445c-8c0d-125ed8e25d42,Namespace:calico-system,Attempt:0,} returns sandbox id \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\"" Jan 23 18:31:00.222400 kubelet[2894]: E0123 18:31:00.222364 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.222933 kubelet[2894]: W0123 18:31:00.222834 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.222933 kubelet[2894]: E0123 18:31:00.222860 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.224012 kubelet[2894]: E0123 18:31:00.224001 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.224217 kubelet[2894]: W0123 18:31:00.224068 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.224217 kubelet[2894]: E0123 18:31:00.224082 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.224921 kubelet[2894]: E0123 18:31:00.224522 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.224921 kubelet[2894]: W0123 18:31:00.224534 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.224921 kubelet[2894]: E0123 18:31:00.224625 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.225328 kubelet[2894]: E0123 18:31:00.225311 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.225328 kubelet[2894]: W0123 18:31:00.225323 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.225906 kubelet[2894]: E0123 18:31:00.225884 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.226100 kubelet[2894]: E0123 18:31:00.226089 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.226214 kubelet[2894]: W0123 18:31:00.226145 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.226319 kubelet[2894]: E0123 18:31:00.226303 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.226495 kubelet[2894]: E0123 18:31:00.226488 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.226582 kubelet[2894]: W0123 18:31:00.226529 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.226676 kubelet[2894]: E0123 18:31:00.226671 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.226761 kubelet[2894]: W0123 18:31:00.226712 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.226848 kubelet[2894]: E0123 18:31:00.226844 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.226877 kubelet[2894]: W0123 18:31:00.226873 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.227172 kubelet[2894]: E0123 18:31:00.227002 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.227172 kubelet[2894]: W0123 18:31:00.227008 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.227454 kubelet[2894]: E0123 18:31:00.227448 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.227489 kubelet[2894]: W0123 18:31:00.227485 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.227531 kubelet[2894]: E0123 18:31:00.227525 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.227654 kubelet[2894]: E0123 18:31:00.227649 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.227776 kubelet[2894]: W0123 18:31:00.227679 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.227776 kubelet[2894]: E0123 18:31:00.227687 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.227880 kubelet[2894]: E0123 18:31:00.227875 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.227911 kubelet[2894]: W0123 18:31:00.227906 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.227951 kubelet[2894]: E0123 18:31:00.227945 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.227993 kubelet[2894]: E0123 18:31:00.227988 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.228163 kubelet[2894]: E0123 18:31:00.228111 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.228163 kubelet[2894]: W0123 18:31:00.228116 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.228163 kubelet[2894]: E0123 18:31:00.228124 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.228430 kubelet[2894]: E0123 18:31:00.228397 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.228756 kubelet[2894]: W0123 18:31:00.228502 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.228756 kubelet[2894]: E0123 18:31:00.228513 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.229045 kubelet[2894]: E0123 18:31:00.228896 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.229342 kubelet[2894]: W0123 18:31:00.229097 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.229342 kubelet[2894]: E0123 18:31:00.229110 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.229945 kubelet[2894]: E0123 18:31:00.229934 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.230202 kubelet[2894]: E0123 18:31:00.230195 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.230267 kubelet[2894]: W0123 18:31:00.230249 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.230551 kubelet[2894]: E0123 18:31:00.230540 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.230824 kubelet[2894]: E0123 18:31:00.230754 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.230824 kubelet[2894]: W0123 18:31:00.230762 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.230824 kubelet[2894]: E0123 18:31:00.230770 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.231181 kubelet[2894]: E0123 18:31:00.231175 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.231219 kubelet[2894]: W0123 18:31:00.231213 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.231283 kubelet[2894]: E0123 18:31:00.231250 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.231660 kubelet[2894]: E0123 18:31:00.231535 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.231660 kubelet[2894]: W0123 18:31:00.231547 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.231660 kubelet[2894]: E0123 18:31:00.231555 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.232274 kubelet[2894]: E0123 18:31:00.230212 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.232356 kubelet[2894]: E0123 18:31:00.232349 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.232390 kubelet[2894]: W0123 18:31:00.232384 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.232422 kubelet[2894]: E0123 18:31:00.232417 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.232588 kubelet[2894]: E0123 18:31:00.232582 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.232627 kubelet[2894]: W0123 18:31:00.232621 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.232666 kubelet[2894]: E0123 18:31:00.232660 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.232898 kubelet[2894]: E0123 18:31:00.230204 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.233708 kubelet[2894]: E0123 18:31:00.233160 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.233782 kubelet[2894]: W0123 18:31:00.233772 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.233819 kubelet[2894]: E0123 18:31:00.233813 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.233978 kubelet[2894]: E0123 18:31:00.233973 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.234025 kubelet[2894]: W0123 18:31:00.234019 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.234059 kubelet[2894]: E0123 18:31:00.234053 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:00.234794 kubelet[2894]: E0123 18:31:00.234771 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.234794 kubelet[2894]: W0123 18:31:00.234784 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.234794 kubelet[2894]: E0123 18:31:00.234795 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:00.240192 kubelet[2894]: E0123 18:31:00.240179 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:00.240192 kubelet[2894]: W0123 18:31:00.240190 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:00.240281 kubelet[2894]: E0123 18:31:00.240200 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:01.377633 kubelet[2894]: E0123 18:31:01.376838 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:01.614938 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2895806073.mount: Deactivated successfully. 
Jan 23 18:31:02.765114 containerd[1690]: time="2026-01-23T18:31:02.764954048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:02.766932 containerd[1690]: time="2026-01-23T18:31:02.766904898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 23 18:31:02.768276 containerd[1690]: time="2026-01-23T18:31:02.767949046Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:02.769676 containerd[1690]: time="2026-01-23T18:31:02.769653675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:02.770062 containerd[1690]: time="2026-01-23T18:31:02.770045919Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.650682418s" Jan 23 18:31:02.770123 containerd[1690]: time="2026-01-23T18:31:02.770115010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 23 18:31:02.771273 containerd[1690]: time="2026-01-23T18:31:02.771243414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 23 18:31:02.802298 containerd[1690]: time="2026-01-23T18:31:02.802186820Z" level=info msg="CreateContainer within sandbox \"8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 23 18:31:02.818679 containerd[1690]: time="2026-01-23T18:31:02.818636705Z" level=info msg="Container 73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:02.827167 containerd[1690]: time="2026-01-23T18:31:02.827097326Z" level=info msg="CreateContainer within sandbox \"8831e657ddd84fc7ba2baeb73c73775c8e601f14b0ab3ef07a6b00a2b828ff2a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20\"" Jan 23 18:31:02.828200 containerd[1690]: time="2026-01-23T18:31:02.827844361Z" level=info msg="StartContainer for \"73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20\"" Jan 23 18:31:02.829270 containerd[1690]: time="2026-01-23T18:31:02.829238949Z" level=info msg="connecting to shim 73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20" address="unix:///run/containerd/s/2a94e43ab1d84b9c3061f164360c13e4c1ce36aaab49f4052744631eb807127d" protocol=ttrpc version=3 Jan 23 18:31:02.849435 systemd[1]: Started cri-containerd-73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20.scope - libcontainer container 73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20. 
Jan 23 18:31:02.861000 audit: BPF prog-id=161 op=LOAD Jan 23 18:31:02.861000 audit: BPF prog-id=162 op=LOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=162 op=UNLOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=163 op=LOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=164 op=LOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=164 op=UNLOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=163 op=UNLOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.861000 audit: BPF prog-id=165 op=LOAD Jan 23 18:31:02.861000 audit[3512]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3315 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:02.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733643762363735626138636263343131653036396335353663613030 Jan 23 18:31:02.908974 containerd[1690]: time="2026-01-23T18:31:02.908242337Z" level=info msg="StartContainer for \"73d7b675ba8cbc411e069c556ca006cadc41c6ac92d404d9f74977c6d77fdb20\" returns successfully" Jan 23 18:31:03.378034 kubelet[2894]: E0123 18:31:03.377996 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:03.467431 kubelet[2894]: I0123 18:31:03.467385 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-788cb4b5fc-w4bjl" podStartSLOduration=1.81483069 podStartE2EDuration="4.467370837s" podCreationTimestamp="2026-01-23 18:30:59 +0000 UTC" firstStartedPulling="2026-01-23 18:31:00.118367915 +0000 UTC m=+22.825703459" lastFinishedPulling="2026-01-23 
18:31:02.770908055 +0000 UTC m=+25.478243606" observedRunningTime="2026-01-23 18:31:03.467123873 +0000 UTC m=+26.174459435" watchObservedRunningTime="2026-01-23 18:31:03.467370837 +0000 UTC m=+26.174706379" Jan 23 18:31:03.535890 kubelet[2894]: E0123 18:31:03.535719 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.535890 kubelet[2894]: W0123 18:31:03.535740 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.535890 kubelet[2894]: E0123 18:31:03.535759 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.536151 kubelet[2894]: E0123 18:31:03.535956 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.536151 kubelet[2894]: W0123 18:31:03.535964 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.536151 kubelet[2894]: E0123 18:31:03.535975 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.536151 kubelet[2894]: E0123 18:31:03.536104 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.536151 kubelet[2894]: W0123 18:31:03.536109 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.536288 kubelet[2894]: E0123 18:31:03.536207 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.536517 kubelet[2894]: E0123 18:31:03.536491 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.536517 kubelet[2894]: W0123 18:31:03.536501 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.536517 kubelet[2894]: E0123 18:31:03.536510 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.537637 kubelet[2894]: E0123 18:31:03.536690 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.537637 kubelet[2894]: W0123 18:31:03.536696 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.537637 kubelet[2894]: E0123 18:31:03.536703 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.537637 kubelet[2894]: E0123 18:31:03.537107 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.537637 kubelet[2894]: W0123 18:31:03.537115 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.537637 kubelet[2894]: E0123 18:31:03.537126 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.537831 kubelet[2894]: E0123 18:31:03.537765 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.537831 kubelet[2894]: W0123 18:31:03.537775 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.537831 kubelet[2894]: E0123 18:31:03.537784 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.538342 kubelet[2894]: E0123 18:31:03.538332 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.538372 kubelet[2894]: W0123 18:31:03.538343 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.538372 kubelet[2894]: E0123 18:31:03.538352 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.538534 kubelet[2894]: E0123 18:31:03.538526 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.538557 kubelet[2894]: W0123 18:31:03.538535 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.538557 kubelet[2894]: E0123 18:31:03.538543 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.538679 kubelet[2894]: E0123 18:31:03.538671 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.538679 kubelet[2894]: W0123 18:31:03.538679 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.538724 kubelet[2894]: E0123 18:31:03.538686 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.539076 kubelet[2894]: E0123 18:31:03.539065 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.539100 kubelet[2894]: W0123 18:31:03.539076 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.539100 kubelet[2894]: E0123 18:31:03.539084 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.539235 kubelet[2894]: E0123 18:31:03.539212 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.539235 kubelet[2894]: W0123 18:31:03.539217 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.539235 kubelet[2894]: E0123 18:31:03.539223 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.539455 kubelet[2894]: E0123 18:31:03.539364 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.539455 kubelet[2894]: W0123 18:31:03.539373 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.539455 kubelet[2894]: E0123 18:31:03.539378 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.539617 kubelet[2894]: E0123 18:31:03.539559 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.539617 kubelet[2894]: W0123 18:31:03.539570 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.539617 kubelet[2894]: E0123 18:31:03.539577 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.539762 kubelet[2894]: E0123 18:31:03.539682 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.539762 kubelet[2894]: W0123 18:31:03.539691 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.539762 kubelet[2894]: E0123 18:31:03.539697 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.554045 kubelet[2894]: E0123 18:31:03.554030 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.554272 kubelet[2894]: W0123 18:31:03.554136 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.554272 kubelet[2894]: E0123 18:31:03.554151 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.554453 kubelet[2894]: E0123 18:31:03.554445 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.554580 kubelet[2894]: W0123 18:31:03.554488 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.554580 kubelet[2894]: E0123 18:31:03.554500 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.554668 kubelet[2894]: E0123 18:31:03.554662 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.554701 kubelet[2894]: W0123 18:31:03.554696 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.554749 kubelet[2894]: E0123 18:31:03.554742 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.554913 kubelet[2894]: E0123 18:31:03.554907 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555012 kubelet[2894]: W0123 18:31:03.554968 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555012 kubelet[2894]: E0123 18:31:03.554980 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.555181 kubelet[2894]: E0123 18:31:03.555175 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555227 kubelet[2894]: W0123 18:31:03.555211 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555269 kubelet[2894]: E0123 18:31:03.555253 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.555411 kubelet[2894]: E0123 18:31:03.555397 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555436 kubelet[2894]: W0123 18:31:03.555413 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555436 kubelet[2894]: E0123 18:31:03.555429 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.555554 kubelet[2894]: E0123 18:31:03.555544 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555554 kubelet[2894]: W0123 18:31:03.555551 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555612 kubelet[2894]: E0123 18:31:03.555562 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.555669 kubelet[2894]: E0123 18:31:03.555661 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555669 kubelet[2894]: W0123 18:31:03.555668 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555731 kubelet[2894]: E0123 18:31:03.555677 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.555807 kubelet[2894]: E0123 18:31:03.555800 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.555842 kubelet[2894]: W0123 18:31:03.555807 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.555842 kubelet[2894]: E0123 18:31:03.555817 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.556067 kubelet[2894]: E0123 18:31:03.556016 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.556067 kubelet[2894]: W0123 18:31:03.556024 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.556067 kubelet[2894]: E0123 18:31:03.556032 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.556274 kubelet[2894]: E0123 18:31:03.556242 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.556274 kubelet[2894]: W0123 18:31:03.556249 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.556376 kubelet[2894]: E0123 18:31:03.556325 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.556558 kubelet[2894]: E0123 18:31:03.556500 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.556558 kubelet[2894]: W0123 18:31:03.556506 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.556558 kubelet[2894]: E0123 18:31:03.556516 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.556735 kubelet[2894]: E0123 18:31:03.556729 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.556787 kubelet[2894]: W0123 18:31:03.556773 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.556828 kubelet[2894]: E0123 18:31:03.556821 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.557008 kubelet[2894]: E0123 18:31:03.556998 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.557032 kubelet[2894]: W0123 18:31:03.557008 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.557032 kubelet[2894]: E0123 18:31:03.557016 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.557134 kubelet[2894]: E0123 18:31:03.557126 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.557134 kubelet[2894]: W0123 18:31:03.557133 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.557193 kubelet[2894]: E0123 18:31:03.557143 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.557275 kubelet[2894]: E0123 18:31:03.557269 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.557298 kubelet[2894]: W0123 18:31:03.557275 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.557298 kubelet[2894]: E0123 18:31:03.557285 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:03.557570 kubelet[2894]: E0123 18:31:03.557534 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.557570 kubelet[2894]: W0123 18:31:03.557542 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.557570 kubelet[2894]: E0123 18:31:03.557553 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:03.557674 kubelet[2894]: E0123 18:31:03.557665 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:03.557696 kubelet[2894]: W0123 18:31:03.557674 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:03.557696 kubelet[2894]: E0123 18:31:03.557681 2894 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:04.343740 containerd[1690]: time="2026-01-23T18:31:04.343280444Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:04.344362 containerd[1690]: time="2026-01-23T18:31:04.344304811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:04.345292 containerd[1690]: time="2026-01-23T18:31:04.345271976Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:04.347485 containerd[1690]: time="2026-01-23T18:31:04.347459025Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:04.348249 containerd[1690]: time="2026-01-23T18:31:04.348021061Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.575670994s" Jan 23 18:31:04.348249 containerd[1690]: time="2026-01-23T18:31:04.348048310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 23 18:31:04.351095 containerd[1690]: time="2026-01-23T18:31:04.350901410Z" level=info msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 23 18:31:04.361784 containerd[1690]: time="2026-01-23T18:31:04.361753982Z" level=info msg="Container 2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:04.365636 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1624450416.mount: Deactivated successfully. 
Jan 23 18:31:04.370511 containerd[1690]: time="2026-01-23T18:31:04.370473241Z" level=info msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6\"" Jan 23 18:31:04.371364 containerd[1690]: time="2026-01-23T18:31:04.371336071Z" level=info msg="StartContainer for \"2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6\"" Jan 23 18:31:04.373036 containerd[1690]: time="2026-01-23T18:31:04.372990767Z" level=info msg="connecting to shim 2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6" address="unix:///run/containerd/s/58f7af9f937b6c7ae459f14b71334f9d99d0ffa40e559799802a1193eeb75e63" protocol=ttrpc version=3 Jan 23 18:31:04.398621 systemd[1]: Started cri-containerd-2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6.scope - libcontainer container 2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6. 
Jan 23 18:31:04.437000 audit: BPF prog-id=166 op=LOAD Jan 23 18:31:04.439971 kernel: kauditd_printk_skb: 68 callbacks suppressed Jan 23 18:31:04.440023 kernel: audit: type=1334 audit(1769193064.437:563): prog-id=166 op=LOAD Jan 23 18:31:04.437000 audit[3590]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.443510 kernel: audit: type=1300 audit(1769193064.437:563): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.437000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.447498 kernel: audit: type=1327 audit(1769193064.437:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.437000 audit: BPF prog-id=167 op=LOAD Jan 23 18:31:04.450413 kernel: audit: type=1334 audit(1769193064.437:564): prog-id=167 op=LOAD Jan 23 18:31:04.437000 audit[3590]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.437000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.457714 kernel: audit: type=1300 audit(1769193064.437:564): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.457760 kernel: audit: type=1327 audit(1769193064.437:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.437000 audit: BPF prog-id=167 op=UNLOAD Jan 23 18:31:04.460790 kernel: audit: type=1334 audit(1769193064.437:565): prog-id=167 op=UNLOAD Jan 23 18:31:04.437000 audit[3590]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.463349 kernel: audit: type=1300 audit(1769193064.437:565): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.437000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.467306 kernel: audit: type=1327 audit(1769193064.437:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.469650 kubelet[2894]: I0123 18:31:04.469630 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 18:31:04.437000 audit: BPF prog-id=166 op=UNLOAD Jan 23 18:31:04.470599 kernel: audit: type=1334 audit(1769193064.437:566): prog-id=166 op=UNLOAD Jan 23 18:31:04.437000 audit[3590]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.437000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.437000 audit: BPF prog-id=168 op=LOAD Jan 23 18:31:04.437000 audit[3590]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3437 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:04.437000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373137663661616662646663636330643533663330656232613037 Jan 23 18:31:04.478778 containerd[1690]: time="2026-01-23T18:31:04.478667390Z" level=info msg="StartContainer for \"2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6\" returns successfully" Jan 23 18:31:04.488046 systemd[1]: cri-containerd-2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6.scope: Deactivated successfully. Jan 23 18:31:04.490834 containerd[1690]: time="2026-01-23T18:31:04.490807162Z" level=info msg="received container exit event container_id:\"2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6\" id:\"2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6\" pid:3602 exited_at:{seconds:1769193064 nanos:490466662}" Jan 23 18:31:04.490000 audit: BPF prog-id=168 op=UNLOAD Jan 23 18:31:04.511331 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2f717f6aafbdfccc0d53f30eb2a0746eaac438353f554a6e76a61c6d0d867df6-rootfs.mount: Deactivated successfully. 
Jan 23 18:31:05.376664 kubelet[2894]: E0123 18:31:05.376607 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:06.477443 containerd[1690]: time="2026-01-23T18:31:06.477251082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 18:31:07.377021 kubelet[2894]: E0123 18:31:07.376650 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:09.378047 kubelet[2894]: E0123 18:31:09.376469 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:10.786805 containerd[1690]: time="2026-01-23T18:31:10.786759599Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:10.787593 containerd[1690]: time="2026-01-23T18:31:10.787481489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 23 18:31:10.788546 containerd[1690]: time="2026-01-23T18:31:10.788527630Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:10.791217 containerd[1690]: 
time="2026-01-23T18:31:10.791175492Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:10.791994 containerd[1690]: time="2026-01-23T18:31:10.791978135Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.314683946s" Jan 23 18:31:10.792119 containerd[1690]: time="2026-01-23T18:31:10.792049760Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 23 18:31:10.794544 containerd[1690]: time="2026-01-23T18:31:10.794523061Z" level=info msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 23 18:31:10.804306 containerd[1690]: time="2026-01-23T18:31:10.804254110Z" level=info msg="Container 72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:10.823280 containerd[1690]: time="2026-01-23T18:31:10.823202077Z" level=info msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5\"" Jan 23 18:31:10.823839 containerd[1690]: time="2026-01-23T18:31:10.823817688Z" level=info msg="StartContainer for \"72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5\"" Jan 23 18:31:10.825522 containerd[1690]: time="2026-01-23T18:31:10.825482268Z" 
level=info msg="connecting to shim 72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5" address="unix:///run/containerd/s/58f7af9f937b6c7ae459f14b71334f9d99d0ffa40e559799802a1193eeb75e63" protocol=ttrpc version=3 Jan 23 18:31:10.848484 systemd[1]: Started cri-containerd-72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5.scope - libcontainer container 72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5. Jan 23 18:31:10.907461 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 23 18:31:10.907578 kernel: audit: type=1334 audit(1769193070.904:569): prog-id=169 op=LOAD Jan 23 18:31:10.904000 audit: BPF prog-id=169 op=LOAD Jan 23 18:31:10.904000 audit[3646]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.909858 kernel: audit: type=1300 audit(1769193070.904:569): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.914150 kernel: audit: type=1327 audit(1769193070.904:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 
23 18:31:10.904000 audit: BPF prog-id=170 op=LOAD Jan 23 18:31:10.917284 kernel: audit: type=1334 audit(1769193070.904:570): prog-id=170 op=LOAD Jan 23 18:31:10.904000 audit[3646]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.919701 kernel: audit: type=1300 audit(1769193070.904:570): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.923874 kernel: audit: type=1327 audit(1769193070.904:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.904000 audit: BPF prog-id=170 op=UNLOAD Jan 23 18:31:10.928280 kernel: audit: type=1334 audit(1769193070.904:571): prog-id=170 op=UNLOAD Jan 23 18:31:10.904000 audit[3646]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.904000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.934621 kernel: audit: type=1300 audit(1769193070.904:571): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.934677 kernel: audit: type=1327 audit(1769193070.904:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.904000 audit: BPF prog-id=169 op=UNLOAD Jan 23 18:31:10.937952 kernel: audit: type=1334 audit(1769193070.904:572): prog-id=169 op=UNLOAD Jan 23 18:31:10.904000 audit[3646]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.904000 audit: BPF prog-id=171 op=LOAD Jan 23 18:31:10.904000 audit[3646]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3437 pid=3646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:10.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732313239636532333533653137613365386631393462363362643635 Jan 23 18:31:10.945289 containerd[1690]: time="2026-01-23T18:31:10.945251656Z" level=info msg="StartContainer for \"72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5\" returns successfully" Jan 23 18:31:11.379317 kubelet[2894]: E0123 18:31:11.379285 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:12.230723 kubelet[2894]: I0123 18:31:12.230586 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 18:31:12.244760 containerd[1690]: time="2026-01-23T18:31:12.244314720Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 18:31:12.246730 systemd[1]: cri-containerd-72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5.scope: Deactivated successfully. Jan 23 18:31:12.247604 systemd[1]: cri-containerd-72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5.scope: Consumed 400ms CPU time, 191M memory peak, 171.3M written to disk. 
Jan 23 18:31:12.248000 audit: BPF prog-id=171 op=UNLOAD Jan 23 18:31:12.252083 containerd[1690]: time="2026-01-23T18:31:12.252058953Z" level=info msg="received container exit event container_id:\"72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5\" id:\"72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5\" pid:3659 exited_at:{seconds:1769193072 nanos:251624700}" Jan 23 18:31:12.274000 audit[3689]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3689 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:12.274000 audit[3689]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff02a2b830 a2=0 a3=7fff02a2b81c items=0 ppid=3032 pid=3689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:12.274000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:12.280000 audit[3689]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3689 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:12.280000 audit[3689]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fff02a2b830 a2=0 a3=7fff02a2b81c items=0 ppid=3032 pid=3689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:12.280000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:12.283891 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-72129ce2353e17a3e8f194b63bd65e73033ac2788e0f5767e39766d30f86b2d5-rootfs.mount: Deactivated successfully. 
Jan 23 18:31:12.285345 kubelet[2894]: I0123 18:31:12.285300 2894 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 23 18:31:12.319099 systemd[1]: Created slice kubepods-besteffort-pod644e9aff_fbfa_4d8d_bb83_94a0bb426243.slice - libcontainer container kubepods-besteffort-pod644e9aff_fbfa_4d8d_bb83_94a0bb426243.slice. Jan 23 18:31:12.329753 kubelet[2894]: I0123 18:31:12.329653 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8h6\" (UniqueName: \"kubernetes.io/projected/4b639c85-fa00-406e-8d74-df95ab4cd9fb-kube-api-access-5x8h6\") pod \"coredns-668d6bf9bc-cmmgk\" (UID: \"4b639c85-fa00-406e-8d74-df95ab4cd9fb\") " pod="kube-system/coredns-668d6bf9bc-cmmgk" Jan 23 18:31:12.329753 kubelet[2894]: I0123 18:31:12.329685 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b639c85-fa00-406e-8d74-df95ab4cd9fb-config-volume\") pod \"coredns-668d6bf9bc-cmmgk\" (UID: \"4b639c85-fa00-406e-8d74-df95ab4cd9fb\") " pod="kube-system/coredns-668d6bf9bc-cmmgk" Jan 23 18:31:12.329753 kubelet[2894]: I0123 18:31:12.329713 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnptr\" (UniqueName: \"kubernetes.io/projected/644e9aff-fbfa-4d8d-bb83-94a0bb426243-kube-api-access-cnptr\") pod \"calico-apiserver-779b7ffd49-78hkk\" (UID: \"644e9aff-fbfa-4d8d-bb83-94a0bb426243\") " pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" Jan 23 18:31:12.329753 kubelet[2894]: I0123 18:31:12.329740 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d67030a-5092-4383-b1de-ac0c48bff4df-config-volume\") pod \"coredns-668d6bf9bc-fpt6r\" (UID: \"9d67030a-5092-4383-b1de-ac0c48bff4df\") " pod="kube-system/coredns-668d6bf9bc-fpt6r" Jan 
23 18:31:12.329753 kubelet[2894]: I0123 18:31:12.329754 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzml4\" (UniqueName: \"kubernetes.io/projected/dad22ee7-a9d6-4858-9e53-0db48fecba12-kube-api-access-mzml4\") pod \"goldmane-666569f655-9xqc6\" (UID: \"dad22ee7-a9d6-4858-9e53-0db48fecba12\") " pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:12.329947 kubelet[2894]: I0123 18:31:12.329768 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8gz\" (UniqueName: \"kubernetes.io/projected/9d67030a-5092-4383-b1de-ac0c48bff4df-kube-api-access-5v8gz\") pod \"coredns-668d6bf9bc-fpt6r\" (UID: \"9d67030a-5092-4383-b1de-ac0c48bff4df\") " pod="kube-system/coredns-668d6bf9bc-fpt6r" Jan 23 18:31:12.329947 kubelet[2894]: I0123 18:31:12.329781 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad22ee7-a9d6-4858-9e53-0db48fecba12-config\") pod \"goldmane-666569f655-9xqc6\" (UID: \"dad22ee7-a9d6-4858-9e53-0db48fecba12\") " pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:12.329947 kubelet[2894]: I0123 18:31:12.329805 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/644e9aff-fbfa-4d8d-bb83-94a0bb426243-calico-apiserver-certs\") pod \"calico-apiserver-779b7ffd49-78hkk\" (UID: \"644e9aff-fbfa-4d8d-bb83-94a0bb426243\") " pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" Jan 23 18:31:12.329947 kubelet[2894]: I0123 18:31:12.329821 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dad22ee7-a9d6-4858-9e53-0db48fecba12-goldmane-ca-bundle\") pod \"goldmane-666569f655-9xqc6\" (UID: 
\"dad22ee7-a9d6-4858-9e53-0db48fecba12\") " pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:12.329947 kubelet[2894]: I0123 18:31:12.329834 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/dad22ee7-a9d6-4858-9e53-0db48fecba12-goldmane-key-pair\") pod \"goldmane-666569f655-9xqc6\" (UID: \"dad22ee7-a9d6-4858-9e53-0db48fecba12\") " pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:12.332690 systemd[1]: Created slice kubepods-burstable-pod9d67030a_5092_4383_b1de_ac0c48bff4df.slice - libcontainer container kubepods-burstable-pod9d67030a_5092_4383_b1de_ac0c48bff4df.slice. Jan 23 18:31:12.344065 systemd[1]: Created slice kubepods-burstable-pod4b639c85_fa00_406e_8d74_df95ab4cd9fb.slice - libcontainer container kubepods-burstable-pod4b639c85_fa00_406e_8d74_df95ab4cd9fb.slice. Jan 23 18:31:12.358332 systemd[1]: Created slice kubepods-besteffort-poddad22ee7_a9d6_4858_9e53_0db48fecba12.slice - libcontainer container kubepods-besteffort-poddad22ee7_a9d6_4858_9e53_0db48fecba12.slice. Jan 23 18:31:12.364301 systemd[1]: Created slice kubepods-besteffort-pod36964dc6_d2ba_456e_b129_1abb6a0d29a4.slice - libcontainer container kubepods-besteffort-pod36964dc6_d2ba_456e_b129_1abb6a0d29a4.slice. Jan 23 18:31:12.373073 systemd[1]: Created slice kubepods-besteffort-pod7a5d89a2_c80e_4f56_8808_99252854603a.slice - libcontainer container kubepods-besteffort-pod7a5d89a2_c80e_4f56_8808_99252854603a.slice. Jan 23 18:31:12.376525 systemd[1]: Created slice kubepods-besteffort-podbe525a6b_06ca_4032_a777_c6e0f1c5eb71.slice - libcontainer container kubepods-besteffort-podbe525a6b_06ca_4032_a777_c6e0f1c5eb71.slice. 
Jan 23 18:31:12.430343 kubelet[2894]: I0123 18:31:12.430071 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn7s7\" (UniqueName: \"kubernetes.io/projected/7a5d89a2-c80e-4f56-8808-99252854603a-kube-api-access-tn7s7\") pod \"calico-apiserver-779b7ffd49-pbrfh\" (UID: \"7a5d89a2-c80e-4f56-8808-99252854603a\") " pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" Jan 23 18:31:12.430343 kubelet[2894]: I0123 18:31:12.430131 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be525a6b-06ca-4032-a777-c6e0f1c5eb71-tigera-ca-bundle\") pod \"calico-kube-controllers-8447c9595f-2lbtc\" (UID: \"be525a6b-06ca-4032-a777-c6e0f1c5eb71\") " pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" Jan 23 18:31:12.430343 kubelet[2894]: I0123 18:31:12.430168 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-backend-key-pair\") pod \"whisker-d5df66c4b-w7p84\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " pod="calico-system/whisker-d5df66c4b-w7p84" Jan 23 18:31:12.430343 kubelet[2894]: I0123 18:31:12.430212 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mxg\" (UniqueName: \"kubernetes.io/projected/36964dc6-d2ba-456e-b129-1abb6a0d29a4-kube-api-access-47mxg\") pod \"whisker-d5df66c4b-w7p84\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " pod="calico-system/whisker-d5df66c4b-w7p84" Jan 23 18:31:12.430343 kubelet[2894]: I0123 18:31:12.430231 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7a5d89a2-c80e-4f56-8808-99252854603a-calico-apiserver-certs\") pod 
\"calico-apiserver-779b7ffd49-pbrfh\" (UID: \"7a5d89a2-c80e-4f56-8808-99252854603a\") " pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" Jan 23 18:31:12.431452 kubelet[2894]: I0123 18:31:12.430900 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-ca-bundle\") pod \"whisker-d5df66c4b-w7p84\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " pod="calico-system/whisker-d5df66c4b-w7p84" Jan 23 18:31:12.431452 kubelet[2894]: I0123 18:31:12.430937 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn2n5\" (UniqueName: \"kubernetes.io/projected/be525a6b-06ca-4032-a777-c6e0f1c5eb71-kube-api-access-cn2n5\") pod \"calico-kube-controllers-8447c9595f-2lbtc\" (UID: \"be525a6b-06ca-4032-a777-c6e0f1c5eb71\") " pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" Jan 23 18:31:12.832160 containerd[1690]: time="2026-01-23T18:31:12.832020629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-78hkk,Uid:644e9aff-fbfa-4d8d-bb83-94a0bb426243,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:12.832694 containerd[1690]: time="2026-01-23T18:31:12.832679008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-fpt6r,Uid:9d67030a-5092-4383-b1de-ac0c48bff4df,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:12.833449 containerd[1690]: time="2026-01-23T18:31:12.833431691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9xqc6,Uid:dad22ee7-a9d6-4858-9e53-0db48fecba12,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:12.833769 containerd[1690]: time="2026-01-23T18:31:12.833702317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cmmgk,Uid:4b639c85-fa00-406e-8d74-df95ab4cd9fb,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:12.834036 
containerd[1690]: time="2026-01-23T18:31:12.834023989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-pbrfh,Uid:7a5d89a2-c80e-4f56-8808-99252854603a,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:12.969639 containerd[1690]: time="2026-01-23T18:31:12.969556740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d5df66c4b-w7p84,Uid:36964dc6-d2ba-456e-b129-1abb6a0d29a4,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:12.979459 containerd[1690]: time="2026-01-23T18:31:12.979438219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8447c9595f-2lbtc,Uid:be525a6b-06ca-4032-a777-c6e0f1c5eb71,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:13.384360 systemd[1]: Created slice kubepods-besteffort-pode681e1b7_9935_4d75_8509_9acd7616e3d8.slice - libcontainer container kubepods-besteffort-pode681e1b7_9935_4d75_8509_9acd7616e3d8.slice. Jan 23 18:31:13.387878 containerd[1690]: time="2026-01-23T18:31:13.387845970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9dvq,Uid:e681e1b7-9935-4d75-8509-9acd7616e3d8,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:13.923509 containerd[1690]: time="2026-01-23T18:31:13.923439694Z" level=error msg="Failed to destroy network for sandbox \"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.929977 containerd[1690]: time="2026-01-23T18:31:13.929929599Z" level=error msg="Failed to destroy network for sandbox \"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.930952 containerd[1690]: 
time="2026-01-23T18:31:13.930068278Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9xqc6,Uid:dad22ee7-a9d6-4858-9e53-0db48fecba12,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.931225 kubelet[2894]: E0123 18:31:13.930297 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.931225 kubelet[2894]: E0123 18:31:13.930378 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:13.931225 kubelet[2894]: E0123 18:31:13.930402 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9xqc6" Jan 23 18:31:13.931788 
kubelet[2894]: E0123 18:31:13.930452 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"30867461197768e0bc2511c69bcf64187e382694e29aa49251b4a9b683b393f7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:31:13.933088 containerd[1690]: time="2026-01-23T18:31:13.933054772Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-78hkk,Uid:644e9aff-fbfa-4d8d-bb83-94a0bb426243,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.933413 kubelet[2894]: E0123 18:31:13.933228 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.933413 kubelet[2894]: E0123 18:31:13.933284 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" Jan 23 18:31:13.933413 kubelet[2894]: E0123 18:31:13.933303 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" Jan 23 18:31:13.933495 kubelet[2894]: E0123 18:31:13.933341 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63133c6624c7379d991fb5b190b851d63b2959c39c252f57cc026bc84514dc1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:31:13.957969 containerd[1690]: time="2026-01-23T18:31:13.957848789Z" level=error msg="Failed to destroy network for sandbox \"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.962573 containerd[1690]: time="2026-01-23T18:31:13.962482169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d5df66c4b-w7p84,Uid:36964dc6-d2ba-456e-b129-1abb6a0d29a4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.963036 kubelet[2894]: E0123 18:31:13.962992 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.963100 kubelet[2894]: E0123 18:31:13.963052 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d5df66c4b-w7p84" Jan 23 18:31:13.963100 kubelet[2894]: E0123 18:31:13.963083 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/whisker-d5df66c4b-w7p84" Jan 23 18:31:13.964319 kubelet[2894]: E0123 18:31:13.963122 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d5df66c4b-w7p84_calico-system(36964dc6-d2ba-456e-b129-1abb6a0d29a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d5df66c4b-w7p84_calico-system(36964dc6-d2ba-456e-b129-1abb6a0d29a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ff3c10a9f2987c2d07f4415c629a64f1f5c1a9eac4bec8e8fb0c9860b8a37d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d5df66c4b-w7p84" podUID="36964dc6-d2ba-456e-b129-1abb6a0d29a4" Jan 23 18:31:13.979734 containerd[1690]: time="2026-01-23T18:31:13.979686710Z" level=error msg="Failed to destroy network for sandbox \"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.983862 containerd[1690]: time="2026-01-23T18:31:13.983821281Z" level=error msg="Failed to destroy network for sandbox \"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.984233 containerd[1690]: time="2026-01-23T18:31:13.984145879Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-pbrfh,Uid:7a5d89a2-c80e-4f56-8808-99252854603a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.984583 kubelet[2894]: E0123 18:31:13.984488 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.984583 kubelet[2894]: E0123 18:31:13.984542 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" Jan 23 18:31:13.984583 kubelet[2894]: E0123 18:31:13.984562 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" Jan 23 18:31:13.984707 kubelet[2894]: E0123 18:31:13.984601 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24940f708660e682169d8b93ad7f17bd360af984c06a004e3311745b60086123\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:31:13.986616 containerd[1690]: time="2026-01-23T18:31:13.986586202Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9dvq,Uid:e681e1b7-9935-4d75-8509-9acd7616e3d8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.986954 kubelet[2894]: E0123 18:31:13.986866 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.987008 kubelet[2894]: E0123 18:31:13.986970 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:13.987008 kubelet[2894]: E0123 18:31:13.986987 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j9dvq" Jan 23 18:31:13.987474 kubelet[2894]: E0123 18:31:13.987030 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c02cf0e74209c8e13f45bb663ad4a7e296be628620e734abc41f120020c4ed63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:13.988193 containerd[1690]: time="2026-01-23T18:31:13.988169139Z" level=error msg="Failed to destroy network for sandbox \"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.994094 containerd[1690]: time="2026-01-23T18:31:13.994068494Z" level=error msg="Failed to destroy network for sandbox \"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.995234 containerd[1690]: time="2026-01-23T18:31:13.995167101Z" level=error msg="Failed to destroy network for sandbox \"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.995520 containerd[1690]: time="2026-01-23T18:31:13.995493931Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-fpt6r,Uid:9d67030a-5092-4383-b1de-ac0c48bff4df,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.995901 kubelet[2894]: E0123 18:31:13.995873 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:13.996007 kubelet[2894]: E0123 18:31:13.995991 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-fpt6r" Jan 23 18:31:13.996130 kubelet[2894]: E0123 18:31:13.996051 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-fpt6r" Jan 23 18:31:13.996194 kubelet[2894]: E0123 18:31:13.996093 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-fpt6r_kube-system(9d67030a-5092-4383-b1de-ac0c48bff4df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-fpt6r_kube-system(9d67030a-5092-4383-b1de-ac0c48bff4df)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e2d8d59cfb62c698c710dab7cba8c2b1880411bb98bc82ef2582b5089be9d98\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-fpt6r" podUID="9d67030a-5092-4383-b1de-ac0c48bff4df" Jan 23 18:31:14.000537 containerd[1690]: time="2026-01-23T18:31:14.000342176Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cmmgk,Uid:4b639c85-fa00-406e-8d74-df95ab4cd9fb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:14.000619 kubelet[2894]: E0123 18:31:14.000494 2894 log.go:32] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:14.000619 kubelet[2894]: E0123 18:31:14.000532 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cmmgk" Jan 23 18:31:14.000619 kubelet[2894]: E0123 18:31:14.000554 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cmmgk" Jan 23 18:31:14.000693 kubelet[2894]: E0123 18:31:14.000584 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-cmmgk_kube-system(4b639c85-fa00-406e-8d74-df95ab4cd9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-cmmgk_kube-system(4b639c85-fa00-406e-8d74-df95ab4cd9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1818e9e2a0bc28238bc95c769af1bebcad51af3e36b46e0dfe2057e0df7d5c65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-cmmgk" podUID="4b639c85-fa00-406e-8d74-df95ab4cd9fb" Jan 23 18:31:14.003112 containerd[1690]: time="2026-01-23T18:31:14.002852932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8447c9595f-2lbtc,Uid:be525a6b-06ca-4032-a777-c6e0f1c5eb71,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:14.003334 kubelet[2894]: E0123 18:31:14.003305 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:14.003374 kubelet[2894]: E0123 18:31:14.003333 2894 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" Jan 23 18:31:14.003398 kubelet[2894]: E0123 18:31:14.003352 2894 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" Jan 23 18:31:14.003421 kubelet[2894]: E0123 18:31:14.003402 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12ac4faccd620430aa25ac446d7ec3887c8005f5f9db22d3d3e5c9ce91bc1e04\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:14.281701 systemd[1]: run-netns-cni\x2da9b7994d\x2de377\x2d536a\x2df0c3\x2ddda948d3caa7.mount: Deactivated successfully. Jan 23 18:31:14.282699 systemd[1]: run-netns-cni\x2d94d52517\x2d7121\x2d37c8\x2d40d0\x2d907398d27a32.mount: Deactivated successfully. Jan 23 18:31:14.282749 systemd[1]: run-netns-cni\x2d97b81a6b\x2d7a19\x2d50cf\x2d0b75\x2dd4c2e4fd0171.mount: Deactivated successfully. Jan 23 18:31:14.282795 systemd[1]: run-netns-cni\x2daa91d1a5\x2d4a96\x2d78d8\x2d71b4\x2d655cde0aba76.mount: Deactivated successfully. Jan 23 18:31:14.282841 systemd[1]: run-netns-cni\x2d964c0726\x2d8277\x2d1732\x2d821f\x2dce1ea037ccd7.mount: Deactivated successfully. Jan 23 18:31:14.282890 systemd[1]: run-netns-cni\x2da80fb447\x2df4eb\x2d523d\x2d78cf\x2d7f85876e9956.mount: Deactivated successfully. Jan 23 18:31:14.282935 systemd[1]: run-netns-cni\x2dbf19265c\x2d506e\x2d6c50\x2d8064\x2d9b2174f0d34c.mount: Deactivated successfully. 
Jan 23 18:31:14.496116 containerd[1690]: time="2026-01-23T18:31:14.496062063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 18:31:21.763910 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4089097296.mount: Deactivated successfully. Jan 23 18:31:21.862147 containerd[1690]: time="2026-01-23T18:31:21.862075606Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:21.863155 containerd[1690]: time="2026-01-23T18:31:21.863034926Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 23 18:31:21.863908 containerd[1690]: time="2026-01-23T18:31:21.863881113Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:21.866858 containerd[1690]: time="2026-01-23T18:31:21.866480646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:21.866858 containerd[1690]: time="2026-01-23T18:31:21.866757970Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.370653178s" Jan 23 18:31:21.866858 containerd[1690]: time="2026-01-23T18:31:21.866780918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 23 18:31:21.879841 containerd[1690]: time="2026-01-23T18:31:21.879795496Z" level=info 
msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 18:31:21.897790 containerd[1690]: time="2026-01-23T18:31:21.897305388Z" level=info msg="Container 395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:21.911553 containerd[1690]: time="2026-01-23T18:31:21.911508770Z" level=info msg="CreateContainer within sandbox \"1efb2b4a99403353a650273a41490376508ac426ba9b8b560e8d4dff6ab6fe13\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a\"" Jan 23 18:31:21.912288 containerd[1690]: time="2026-01-23T18:31:21.912032545Z" level=info msg="StartContainer for \"395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a\"" Jan 23 18:31:21.914391 containerd[1690]: time="2026-01-23T18:31:21.914357155Z" level=info msg="connecting to shim 395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a" address="unix:///run/containerd/s/58f7af9f937b6c7ae459f14b71334f9d99d0ffa40e559799802a1193eeb75e63" protocol=ttrpc version=3 Jan 23 18:31:21.933450 systemd[1]: Started cri-containerd-395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a.scope - libcontainer container 395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a. 
Jan 23 18:31:21.976000 audit: BPF prog-id=172 op=LOAD Jan 23 18:31:21.978651 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 23 18:31:21.978696 kernel: audit: type=1334 audit(1769193081.976:577): prog-id=172 op=LOAD Jan 23 18:31:21.976000 audit[3925]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.982062 kernel: audit: type=1300 audit(1769193081.976:577): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:21.986068 kernel: audit: type=1327 audit(1769193081.976:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:21.976000 audit: BPF prog-id=173 op=LOAD Jan 23 18:31:21.989637 kernel: audit: type=1334 audit(1769193081.976:578): prog-id=173 op=LOAD Jan 23 18:31:21.976000 audit[3925]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.991663 kernel: audit: type=1300 audit(1769193081.976:578): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:21.995572 kernel: audit: type=1327 audit(1769193081.976:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:21.976000 audit: BPF prog-id=173 op=UNLOAD Jan 23 18:31:21.998879 kernel: audit: type=1334 audit(1769193081.976:579): prog-id=173 op=UNLOAD Jan 23 18:31:21.999066 kernel: audit: type=1300 audit(1769193081.976:579): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit[3925]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:22.004466 kernel: audit: type=1327 audit(1769193081.976:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:22.007770 kernel: audit: type=1334 audit(1769193081.976:580): prog-id=172 op=UNLOAD Jan 23 18:31:21.976000 audit: BPF prog-id=172 op=UNLOAD Jan 23 18:31:21.976000 audit[3925]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:21.976000 audit: BPF prog-id=174 op=LOAD Jan 23 18:31:21.976000 audit[3925]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3437 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:21.976000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339356433663234386233643237333633383230653064316365623632 Jan 23 18:31:22.021622 containerd[1690]: time="2026-01-23T18:31:22.021305820Z" level=info msg="StartContainer for \"395d3f248b3d27363820e0d1ceb62015e419f1e7445ff66e62720b939b64d94a\" returns successfully" Jan 23 18:31:22.098601 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 18:31:22.098726 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 23 18:31:22.298431 kubelet[2894]: I0123 18:31:22.298010 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-ca-bundle\") pod \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " Jan 23 18:31:22.299106 kubelet[2894]: I0123 18:31:22.298789 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mxg\" (UniqueName: \"kubernetes.io/projected/36964dc6-d2ba-456e-b129-1abb6a0d29a4-kube-api-access-47mxg\") pod \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " Jan 23 18:31:22.299106 kubelet[2894]: I0123 18:31:22.298829 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-backend-key-pair\") pod \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\" (UID: \"36964dc6-d2ba-456e-b129-1abb6a0d29a4\") " Jan 23 18:31:22.299197 kubelet[2894]: I0123 18:31:22.298369 2894 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-ca-bundle" 
(OuterVolumeSpecName: "whisker-ca-bundle") pod "36964dc6-d2ba-456e-b129-1abb6a0d29a4" (UID: "36964dc6-d2ba-456e-b129-1abb6a0d29a4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 18:31:22.302701 kubelet[2894]: I0123 18:31:22.302679 2894 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "36964dc6-d2ba-456e-b129-1abb6a0d29a4" (UID: "36964dc6-d2ba-456e-b129-1abb6a0d29a4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 18:31:22.302935 kubelet[2894]: I0123 18:31:22.302912 2894 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36964dc6-d2ba-456e-b129-1abb6a0d29a4-kube-api-access-47mxg" (OuterVolumeSpecName: "kube-api-access-47mxg") pod "36964dc6-d2ba-456e-b129-1abb6a0d29a4" (UID: "36964dc6-d2ba-456e-b129-1abb6a0d29a4"). InnerVolumeSpecName "kube-api-access-47mxg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 18:31:22.400074 kubelet[2894]: I0123 18:31:22.400038 2894 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-ca-bundle\") on node \"ci-4547-1-0-2-32611d5cc2\" DevicePath \"\"" Jan 23 18:31:22.400250 kubelet[2894]: I0123 18:31:22.400220 2894 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47mxg\" (UniqueName: \"kubernetes.io/projected/36964dc6-d2ba-456e-b129-1abb6a0d29a4-kube-api-access-47mxg\") on node \"ci-4547-1-0-2-32611d5cc2\" DevicePath \"\"" Jan 23 18:31:22.400250 kubelet[2894]: I0123 18:31:22.400236 2894 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/36964dc6-d2ba-456e-b129-1abb6a0d29a4-whisker-backend-key-pair\") on node \"ci-4547-1-0-2-32611d5cc2\" DevicePath \"\"" Jan 23 18:31:22.521019 systemd[1]: Removed slice kubepods-besteffort-pod36964dc6_d2ba_456e_b129_1abb6a0d29a4.slice - libcontainer container kubepods-besteffort-pod36964dc6_d2ba_456e_b129_1abb6a0d29a4.slice. Jan 23 18:31:22.695298 kubelet[2894]: I0123 18:31:22.695005 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-468qx" podStartSLOduration=2.051928178 podStartE2EDuration="23.694991311s" podCreationTimestamp="2026-01-23 18:30:59 +0000 UTC" firstStartedPulling="2026-01-23 18:31:00.224566107 +0000 UTC m=+22.931901650" lastFinishedPulling="2026-01-23 18:31:21.867629239 +0000 UTC m=+44.574964783" observedRunningTime="2026-01-23 18:31:22.689292359 +0000 UTC m=+45.396627915" watchObservedRunningTime="2026-01-23 18:31:22.694991311 +0000 UTC m=+45.402326887" Jan 23 18:31:22.764936 systemd[1]: var-lib-kubelet-pods-36964dc6\x2dd2ba\x2d456e\x2db129\x2d1abb6a0d29a4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d47mxg.mount: Deactivated successfully. 
Jan 23 18:31:22.765022 systemd[1]: var-lib-kubelet-pods-36964dc6\x2dd2ba\x2d456e\x2db129\x2d1abb6a0d29a4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 23 18:31:22.791060 systemd[1]: Created slice kubepods-besteffort-poda7bedb6c_04ad_4dfc_97e0_53467bd29e69.slice - libcontainer container kubepods-besteffort-poda7bedb6c_04ad_4dfc_97e0_53467bd29e69.slice. Jan 23 18:31:22.802213 kubelet[2894]: I0123 18:31:22.802180 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjpk\" (UniqueName: \"kubernetes.io/projected/a7bedb6c-04ad-4dfc-97e0-53467bd29e69-kube-api-access-qvjpk\") pod \"whisker-775b766696-sj9gw\" (UID: \"a7bedb6c-04ad-4dfc-97e0-53467bd29e69\") " pod="calico-system/whisker-775b766696-sj9gw" Jan 23 18:31:22.802213 kubelet[2894]: I0123 18:31:22.802216 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a7bedb6c-04ad-4dfc-97e0-53467bd29e69-whisker-backend-key-pair\") pod \"whisker-775b766696-sj9gw\" (UID: \"a7bedb6c-04ad-4dfc-97e0-53467bd29e69\") " pod="calico-system/whisker-775b766696-sj9gw" Jan 23 18:31:22.802369 kubelet[2894]: I0123 18:31:22.802233 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7bedb6c-04ad-4dfc-97e0-53467bd29e69-whisker-ca-bundle\") pod \"whisker-775b766696-sj9gw\" (UID: \"a7bedb6c-04ad-4dfc-97e0-53467bd29e69\") " pod="calico-system/whisker-775b766696-sj9gw" Jan 23 18:31:23.095781 containerd[1690]: time="2026-01-23T18:31:23.095534345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-775b766696-sj9gw,Uid:a7bedb6c-04ad-4dfc-97e0-53467bd29e69,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:23.253196 systemd-networkd[1583]: caliab5310f6468: Link UP Jan 23 18:31:23.253744 
systemd-networkd[1583]: caliab5310f6468: Gained carrier Jan 23 18:31:23.274918 containerd[1690]: 2026-01-23 18:31:23.119 [INFO][4013] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 18:31:23.274918 containerd[1690]: 2026-01-23 18:31:23.184 [INFO][4013] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0 whisker-775b766696- calico-system a7bedb6c-04ad-4dfc-97e0-53467bd29e69 875 0 2026-01-23 18:31:22 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:775b766696 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 whisker-775b766696-sj9gw eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliab5310f6468 [] [] }} ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-" Jan 23 18:31:23.274918 containerd[1690]: 2026-01-23 18:31:23.185 [INFO][4013] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.274918 containerd[1690]: 2026-01-23 18:31:23.208 [INFO][4025] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" HandleID="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Workload="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.208 [INFO][4025] ipam/ipam_plugin.go 275: Auto 
assigning IP ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" HandleID="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Workload="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"whisker-775b766696-sj9gw", "timestamp":"2026-01-23 18:31:23.208133656 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.208 [INFO][4025] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.208 [INFO][4025] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.208 [INFO][4025] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.214 [INFO][4025] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.219 [INFO][4025] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.223 [INFO][4025] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.225 [INFO][4025] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275147 containerd[1690]: 2026-01-23 18:31:23.227 [INFO][4025] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.227 [INFO][4025] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.228 [INFO][4025] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.234 [INFO][4025] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.239 [INFO][4025] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.28.129/26] block=192.168.28.128/26 handle="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.239 [INFO][4025] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.129/26] handle="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.239 [INFO][4025] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:23.275358 containerd[1690]: 2026-01-23 18:31:23.239 [INFO][4025] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.129/26] IPv6=[] ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" HandleID="k8s-pod-network.ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Workload="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.275506 containerd[1690]: 2026-01-23 18:31:23.242 [INFO][4013] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0", GenerateName:"whisker-775b766696-", Namespace:"calico-system", SelfLink:"", UID:"a7bedb6c-04ad-4dfc-97e0-53467bd29e69", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"775b766696", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"whisker-775b766696-sj9gw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.28.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliab5310f6468", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:23.275506 containerd[1690]: 2026-01-23 18:31:23.242 [INFO][4013] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.129/32] ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.275580 containerd[1690]: 2026-01-23 18:31:23.242 [INFO][4013] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab5310f6468 ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.275580 containerd[1690]: 2026-01-23 18:31:23.252 [INFO][4013] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.275621 containerd[1690]: 2026-01-23 18:31:23.253 [INFO][4013] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0", GenerateName:"whisker-775b766696-", Namespace:"calico-system", SelfLink:"", UID:"a7bedb6c-04ad-4dfc-97e0-53467bd29e69", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"775b766696", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e", Pod:"whisker-775b766696-sj9gw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.28.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliab5310f6468", MAC:"ee:f1:64:22:75:96", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:23.275675 containerd[1690]: 2026-01-23 18:31:23.273 [INFO][4013] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" Namespace="calico-system" Pod="whisker-775b766696-sj9gw" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-whisker--775b766696--sj9gw-eth0" Jan 23 18:31:23.297876 containerd[1690]: time="2026-01-23T18:31:23.297839957Z" level=info msg="connecting to shim ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e" address="unix:///run/containerd/s/7e04c48733818350b55d3284954d5a7486f5fd0afdbe7ac4313c977b8a1b8c97" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:23.326516 systemd[1]: Started cri-containerd-ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e.scope - libcontainer container ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e. Jan 23 18:31:23.335000 audit: BPF prog-id=175 op=LOAD Jan 23 18:31:23.335000 audit: BPF prog-id=176 op=LOAD Jan 23 18:31:23.335000 audit[4057]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.335000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=176 op=UNLOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=177 op=LOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=178 op=LOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=178 op=UNLOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=177 op=UNLOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.336000 audit: BPF prog-id=179 op=LOAD Jan 23 18:31:23.336000 audit[4057]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4045 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:23.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465643030363936663431313832343830666535376532316134646537 Jan 23 18:31:23.368630 containerd[1690]: time="2026-01-23T18:31:23.368584482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-775b766696-sj9gw,Uid:a7bedb6c-04ad-4dfc-97e0-53467bd29e69,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"ded00696f41182480fe57e21a4de7a2c50f335803221d22726441d7affb8857e\"" Jan 23 18:31:23.370370 containerd[1690]: time="2026-01-23T18:31:23.369985717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:31:23.378560 kubelet[2894]: I0123 18:31:23.378413 2894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36964dc6-d2ba-456e-b129-1abb6a0d29a4" path="/var/lib/kubelet/pods/36964dc6-d2ba-456e-b129-1abb6a0d29a4/volumes" Jan 23 18:31:23.711236 containerd[1690]: time="2026-01-23T18:31:23.711132746Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:23.712400 containerd[1690]: time="2026-01-23T18:31:23.712371796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:31:23.712472 containerd[1690]: time="2026-01-23T18:31:23.712387559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:23.712748 kubelet[2894]: E0123 18:31:23.712577 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:23.712748 kubelet[2894]: E0123 18:31:23.712622 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:23.717998 kubelet[2894]: E0123 18:31:23.717933 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:9dc376b3bb22419f9d12e1d73f9667ce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:23.719841 containerd[1690]: time="2026-01-23T18:31:23.719805087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:31:24.039600 containerd[1690]: 
time="2026-01-23T18:31:24.038518513Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:24.041432 containerd[1690]: time="2026-01-23T18:31:24.041317866Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:31:24.041432 containerd[1690]: time="2026-01-23T18:31:24.041409433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:24.041705 kubelet[2894]: E0123 18:31:24.041657 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:31:24.041751 kubelet[2894]: E0123 18:31:24.041709 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:31:24.041848 kubelet[2894]: E0123 18:31:24.041814 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:24.043186 kubelet[2894]: E0123 18:31:24.043115 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:31:24.227000 audit: BPF prog-id=180 op=LOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe7de26070 a2=98 a3=1fffffffffffffff items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.227000 audit: BPF prog-id=180 op=UNLOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe7de26040 a3=0 items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.227000 audit: BPF prog-id=181 op=LOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe7de25f50 a2=94 a3=3 items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.227000 audit: BPF prog-id=181 op=UNLOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe7de25f50 a2=94 a3=3 items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.227000 audit: BPF prog-id=182 op=LOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe7de25f90 a2=94 a3=7ffe7de26170 items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.227000 audit: BPF prog-id=182 op=UNLOAD Jan 23 18:31:24.227000 audit[4229]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe7de25f90 a2=94 a3=7ffe7de26170 items=0 ppid=4131 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.227000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:24.229000 audit: BPF prog-id=183 op=LOAD Jan 23 18:31:24.229000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd6a9f00d0 a2=98 a3=3 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.229000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.229000 audit: BPF prog-id=183 op=UNLOAD Jan 23 18:31:24.229000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd6a9f00a0 a3=0 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.229000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.230000 audit: BPF prog-id=184 op=LOAD Jan 23 18:31:24.230000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd6a9efec0 a2=94 a3=54428f items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.230000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.230000 audit: BPF prog-id=184 op=UNLOAD Jan 23 18:31:24.230000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd6a9efec0 a2=94 a3=54428f items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.230000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.230000 audit: BPF prog-id=185 op=LOAD Jan 23 18:31:24.230000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd6a9efef0 a2=94 a3=2 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.230000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.230000 audit: BPF prog-id=185 op=UNLOAD Jan 23 18:31:24.230000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd6a9efef0 a2=0 a3=2 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.230000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 
18:31:24.386000 audit: BPF prog-id=186 op=LOAD Jan 23 18:31:24.386000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd6a9efdb0 a2=94 a3=1 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.386000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.386000 audit: BPF prog-id=186 op=UNLOAD Jan 23 18:31:24.386000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd6a9efdb0 a2=94 a3=1 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.386000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=187 op=LOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd6a9efda0 a2=94 a3=4 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=187 op=UNLOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd6a9efda0 a2=0 a3=4 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=188 op=LOAD Jan 23 18:31:24.397000 
audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd6a9efc00 a2=94 a3=5 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=188 op=UNLOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd6a9efc00 a2=0 a3=5 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=189 op=LOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd6a9efe20 a2=94 a3=6 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=189 op=UNLOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd6a9efe20 a2=0 a3=6 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.397000 audit: BPF prog-id=190 op=LOAD Jan 23 18:31:24.397000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 
a1=7ffd6a9ef5d0 a2=94 a3=88 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.397000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.398000 audit: BPF prog-id=191 op=LOAD Jan 23 18:31:24.398000 audit[4230]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd6a9ef450 a2=94 a3=2 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.398000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.398000 audit: BPF prog-id=191 op=UNLOAD Jan 23 18:31:24.398000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd6a9ef480 a2=0 a3=7ffd6a9ef580 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.398000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.398000 audit: BPF prog-id=190 op=UNLOAD Jan 23 18:31:24.398000 audit[4230]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=3dad1d10 a2=0 a3=56265a9e814ef2c7 items=0 ppid=4131 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.398000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:24.409000 audit: BPF prog-id=192 op=LOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffb65b16b0 a2=98 a3=1999999999999999 items=0 
ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:24.409000 audit: BPF prog-id=192 op=UNLOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffb65b1680 a3=0 items=0 ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:24.409000 audit: BPF prog-id=193 op=LOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffb65b1590 a2=94 a3=ffff items=0 ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:24.409000 audit: BPF prog-id=193 op=UNLOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffb65b1590 a2=94 a3=ffff items=0 ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:24.409000 audit: BPF prog-id=194 op=LOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffb65b15d0 a2=94 a3=7fffb65b17b0 items=0 ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:24.409000 audit: BPF prog-id=194 op=UNLOAD Jan 23 18:31:24.409000 audit[4233]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffb65b15d0 a2=94 a3=7fffb65b17b0 items=0 ppid=4131 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.409000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F 
Jan 23 18:31:24.558793 containerd[1690]: time="2026-01-23T18:31:24.558472652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8447c9595f-2lbtc,Uid:be525a6b-06ca-4032-a777-c6e0f1c5eb71,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:24.558793 containerd[1690]: time="2026-01-23T18:31:24.558653988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9xqc6,Uid:dad22ee7-a9d6-4858-9e53-0db48fecba12,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:24.761912 kubelet[2894]: E0123 18:31:24.761484 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:31:24.788081 systemd-networkd[1583]: vxlan.calico: Link UP Jan 23 18:31:24.788089 systemd-networkd[1583]: vxlan.calico: Gained carrier Jan 23 18:31:24.789000 audit[4251]: NETFILTER_CFG table=filter:119 family=2 entries=20 op=nft_register_rule pid=4251 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:24.789000 audit[4251]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffebe733880 a2=0 a3=7ffebe73386c items=0 ppid=3032 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.789000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:24.792000 audit[4251]: NETFILTER_CFG table=nat:120 family=2 entries=14 op=nft_register_rule pid=4251 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:24.792000 audit[4251]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffebe733880 a2=0 a3=0 items=0 ppid=3032 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.792000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:24.817000 audit: BPF prog-id=195 op=LOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff723a5650 a2=98 a3=0 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=195 op=UNLOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff723a5620 a3=0 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=196 op=LOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff723a5460 a2=94 a3=54428f items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=196 op=UNLOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff723a5460 a2=94 a3=54428f items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=197 op=LOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff723a5490 a2=94 a3=2 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=197 op=UNLOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff723a5490 a2=0 a3=2 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=198 op=LOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff723a5240 a2=94 a3=4 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=198 op=UNLOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff723a5240 a2=94 a3=4 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=199 op=LOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff723a5340 a2=94 a3=7fff723a54c0 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.817000 audit: BPF prog-id=199 op=UNLOAD Jan 23 18:31:24.817000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff723a5340 a2=0 a3=7fff723a54c0 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.817000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.819000 audit: BPF prog-id=200 op=LOAD Jan 23 18:31:24.819000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff723a4a70 a2=94 a3=2 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.819000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.819000 audit: BPF prog-id=200 op=UNLOAD Jan 23 18:31:24.819000 audit[4263]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff723a4a70 a2=0 a3=2 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.819000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.819000 audit: BPF prog-id=201 op=LOAD Jan 23 18:31:24.819000 audit[4263]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff723a4b70 a2=94 a3=30 items=0 ppid=4131 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.819000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:24.826000 audit: BPF prog-id=202 op=LOAD Jan 23 18:31:24.826000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea31bbe50 a2=98 a3=0 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.826000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.826000 audit: BPF prog-id=202 op=UNLOAD Jan 23 18:31:24.826000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffea31bbe20 a3=0 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.826000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.827000 audit: BPF prog-id=203 op=LOAD Jan 23 18:31:24.827000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffea31bbc40 a2=94 a3=54428f items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.827000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.827000 audit: BPF prog-id=203 op=UNLOAD Jan 23 18:31:24.827000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffea31bbc40 a2=94 a3=54428f items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.827000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.827000 audit: BPF prog-id=204 op=LOAD Jan 23 18:31:24.827000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffea31bbc70 a2=94 a3=2 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.827000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.827000 audit: BPF prog-id=204 op=UNLOAD Jan 23 18:31:24.827000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffea31bbc70 a2=0 a3=2 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:24.827000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:24.869348 systemd-networkd[1583]: caliab5310f6468: Gained IPv6LL Jan 23 18:31:25.052000 audit: BPF prog-id=205 op=LOAD Jan 23 18:31:25.052000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffea31bbb30 a2=94 a3=1 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.052000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.052000 audit: BPF prog-id=205 op=UNLOAD Jan 23 18:31:25.052000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffea31bbb30 a2=94 a3=1 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.052000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=206 op=LOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffea31bbb20 a2=94 a3=4 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=206 op=UNLOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffea31bbb20 a2=0 a3=4 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=207 op=LOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffea31bb980 a2=94 a3=5 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=207 op=UNLOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffea31bb980 a2=0 a3=5 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=208 op=LOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffea31bbba0 a2=94 a3=6 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=208 op=UNLOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffea31bbba0 a2=0 a3=6 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.064000 audit: BPF prog-id=209 op=LOAD Jan 23 18:31:25.064000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffea31bb350 a2=94 a3=88 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.064000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.065000 audit: BPF prog-id=210 op=LOAD Jan 23 18:31:25.065000 audit[4266]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffea31bb1d0 a2=94 a3=2 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.065000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.065000 audit: BPF prog-id=210 op=UNLOAD Jan 23 18:31:25.065000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffea31bb200 a2=0 a3=7ffea31bb300 items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.065000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.065000 audit: BPF prog-id=209 op=UNLOAD Jan 23 18:31:25.065000 audit[4266]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=c9ead10 a2=0 a3=61917a9681a8889a items=0 ppid=4131 pid=4266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.065000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:25.072000 audit: BPF prog-id=201 op=UNLOAD Jan 23 18:31:25.072000 audit[4131]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c0006a4580 a2=0 a3=0 items=0 ppid=4111 pid=4131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.072000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 23 18:31:25.377830 containerd[1690]: 
time="2026-01-23T18:31:25.376987282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-78hkk,Uid:644e9aff-fbfa-4d8d-bb83-94a0bb426243,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:25.501153 systemd-networkd[1583]: cali4e656fde26b: Link UP Jan 23 18:31:25.503138 systemd-networkd[1583]: cali4e656fde26b: Gained carrier Jan 23 18:31:25.519825 containerd[1690]: 2026-01-23 18:31:25.138 [INFO][4276] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0 goldmane-666569f655- calico-system dad22ee7-a9d6-4858-9e53-0db48fecba12 813 0 2026-01-23 18:30:58 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 goldmane-666569f655-9xqc6 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4e656fde26b [] [] }} ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-" Jan 23 18:31:25.519825 containerd[1690]: 2026-01-23 18:31:25.138 [INFO][4276] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.519825 containerd[1690]: 2026-01-23 18:31:25.172 [INFO][4294] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" HandleID="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" 
Workload="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.172 [INFO][4294] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" HandleID="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Workload="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"goldmane-666569f655-9xqc6", "timestamp":"2026-01-23 18:31:25.17255826 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.172 [INFO][4294] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.172 [INFO][4294] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.172 [INFO][4294] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.180 [INFO][4294] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.184 [INFO][4294] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.196 [INFO][4294] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.202 [INFO][4294] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520008 containerd[1690]: 2026-01-23 18:31:25.207 [INFO][4294] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.207 [INFO][4294] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.210 [INFO][4294] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65 Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.217 [INFO][4294] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.224 [INFO][4294] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.28.130/26] block=192.168.28.128/26 handle="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.224 [INFO][4294] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.130/26] handle="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.224 [INFO][4294] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:25.520184 containerd[1690]: 2026-01-23 18:31:25.224 [INFO][4294] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.130/26] IPv6=[] ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" HandleID="k8s-pod-network.b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Workload="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.520323 containerd[1690]: 2026-01-23 18:31:25.228 [INFO][4276] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"dad22ee7-a9d6-4858-9e53-0db48fecba12", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"goldmane-666569f655-9xqc6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.28.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e656fde26b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:25.520372 containerd[1690]: 2026-01-23 18:31:25.228 [INFO][4276] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.130/32] ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.520372 containerd[1690]: 2026-01-23 18:31:25.229 [INFO][4276] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e656fde26b ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.520372 containerd[1690]: 2026-01-23 18:31:25.501 [INFO][4276] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.520425 containerd[1690]: 2026-01-23 18:31:25.502 [INFO][4276] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"dad22ee7-a9d6-4858-9e53-0db48fecba12", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65", Pod:"goldmane-666569f655-9xqc6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.28.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e656fde26b", MAC:"a2:5e:ac:21:6e:d8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:25.520473 containerd[1690]: 2026-01-23 18:31:25.517 [INFO][4276] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" Namespace="calico-system" Pod="goldmane-666569f655-9xqc6" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-goldmane--666569f655--9xqc6-eth0" Jan 23 18:31:25.580011 systemd-networkd[1583]: cali2decfde2beb: Link UP Jan 23 18:31:25.581072 systemd-networkd[1583]: cali2decfde2beb: Gained carrier Jan 23 18:31:25.599401 containerd[1690]: 2026-01-23 18:31:25.506 [INFO][4272] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0 calico-kube-controllers-8447c9595f- calico-system be525a6b-06ca-4032-a777-c6e0f1c5eb71 811 0 2026-01-23 18:31:00 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8447c9595f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 calico-kube-controllers-8447c9595f-2lbtc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2decfde2beb [] [] }} ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-" Jan 23 18:31:25.599401 containerd[1690]: 2026-01-23 18:31:25.506 [INFO][4272] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.599401 containerd[1690]: 2026-01-23 18:31:25.540 [INFO][4313] ipam/ipam_plugin.go 227: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" HandleID="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.540 [INFO][4313] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" HandleID="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"calico-kube-controllers-8447c9595f-2lbtc", "timestamp":"2026-01-23 18:31:25.540204943 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.540 [INFO][4313] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.540 [INFO][4313] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.540 [INFO][4313] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.550 [INFO][4313] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.555 [INFO][4313] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.560 [INFO][4313] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.561 [INFO][4313] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599747 containerd[1690]: 2026-01-23 18:31:25.564 [INFO][4313] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.564 [INFO][4313] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.565 [INFO][4313] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24 Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.569 [INFO][4313] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.575 [INFO][4313] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.28.131/26] block=192.168.28.128/26 handle="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.575 [INFO][4313] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.131/26] handle="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.575 [INFO][4313] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:25.599971 containerd[1690]: 2026-01-23 18:31:25.575 [INFO][4313] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.131/26] IPv6=[] ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" HandleID="k8s-pod-network.71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.600100 containerd[1690]: 2026-01-23 18:31:25.577 [INFO][4272] cni-plugin/k8s.go 418: Populated endpoint ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0", GenerateName:"calico-kube-controllers-8447c9595f-", Namespace:"calico-system", SelfLink:"", UID:"be525a6b-06ca-4032-a777-c6e0f1c5eb71", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8447c9595f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"calico-kube-controllers-8447c9595f-2lbtc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.28.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2decfde2beb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:25.600153 containerd[1690]: 2026-01-23 18:31:25.577 [INFO][4272] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.131/32] ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.600153 containerd[1690]: 2026-01-23 18:31:25.577 [INFO][4272] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2decfde2beb ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.600153 containerd[1690]: 2026-01-23 18:31:25.581 [INFO][4272] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.600210 containerd[1690]: 2026-01-23 18:31:25.582 [INFO][4272] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0", GenerateName:"calico-kube-controllers-8447c9595f-", Namespace:"calico-system", SelfLink:"", UID:"be525a6b-06ca-4032-a777-c6e0f1c5eb71", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8447c9595f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24", Pod:"calico-kube-controllers-8447c9595f-2lbtc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.28.131/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2decfde2beb", MAC:"5a:65:cb:99:9e:fd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:25.600275 containerd[1690]: 2026-01-23 18:31:25.595 [INFO][4272] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" Namespace="calico-system" Pod="calico-kube-controllers-8447c9595f-2lbtc" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--kube--controllers--8447c9595f--2lbtc-eth0" Jan 23 18:31:25.847478 containerd[1690]: time="2026-01-23T18:31:25.847283921Z" level=info msg="connecting to shim 71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24" address="unix:///run/containerd/s/ff25c74f2da809bb3b0959c759a289ad3804097d22c26e5aa3cd8fb54c567866" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:25.852849 containerd[1690]: time="2026-01-23T18:31:25.852390766Z" level=info msg="connecting to shim b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65" address="unix:///run/containerd/s/19ec04c8c9e52548d1650ab0a01df81c69f4f0ec377ef4a37b9946d8336dbc28" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:25.867000 audit[4369]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4369 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:25.867000 audit[4369]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffcd0790880 a2=0 a3=7ffcd079086c items=0 ppid=4131 pid=4369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.867000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:25.888434 systemd[1]: Started cri-containerd-b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65.scope - libcontainer container b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65. Jan 23 18:31:25.894980 systemd[1]: Started cri-containerd-71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24.scope - libcontainer container 71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24. Jan 23 18:31:25.919000 audit: BPF prog-id=211 op=LOAD Jan 23 18:31:25.921000 audit: BPF prog-id=212 op=LOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=212 op=UNLOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=213 op=LOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=214 op=LOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=214 op=UNLOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=213 op=UNLOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.921000 audit: BPF prog-id=215 op=LOAD Jan 23 18:31:25.921000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731626134353636666634666461316364646634626235666232353838 Jan 23 18:31:25.923000 audit[4429]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=4429 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:25.923000 audit[4429]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffda3e7c290 a2=0 a3=7ffda3e7c27c items=0 
ppid=4131 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.923000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:25.924000 audit: BPF prog-id=216 op=LOAD Jan 23 18:31:25.924000 audit: BPF prog-id=217 op=LOAD Jan 23 18:31:25.924000 audit[4379]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.924000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.925000 audit: BPF prog-id=217 op=UNLOAD Jan 23 18:31:25.925000 audit[4379]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.925000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.925000 audit: BPF prog-id=218 op=LOAD Jan 23 18:31:25.925000 audit[4379]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 
a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.925000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.925000 audit: BPF prog-id=219 op=LOAD Jan 23 18:31:25.925000 audit[4379]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.925000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.926000 audit: BPF prog-id=219 op=UNLOAD Jan 23 18:31:25.926000 audit[4379]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.926000 audit: BPF prog-id=218 op=UNLOAD Jan 23 18:31:25.926000 audit[4379]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.926000 audit: BPF prog-id=220 op=LOAD Jan 23 18:31:25.926000 audit[4379]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4363 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.926000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234323634303638353134623938656661373566633933643435313131 Jan 23 18:31:25.928000 audit[4398]: NETFILTER_CFG table=filter:123 family=2 entries=94 op=nft_register_chain pid=4398 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:25.928000 audit[4398]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7ffe45f836a0 a2=0 a3=55fa98a6e000 items=0 ppid=4131 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.928000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:25.982000 
audit[4393]: NETFILTER_CFG table=raw:124 family=2 entries=21 op=nft_register_chain pid=4393 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:25.982000 audit[4393]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffceae31440 a2=0 a3=7ffceae3142c items=0 ppid=4131 pid=4393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:25.982000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:26.025000 audit[4460]: NETFILTER_CFG table=filter:125 family=2 entries=72 op=nft_register_chain pid=4460 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:26.025000 audit[4460]: SYSCALL arch=c000003e syscall=46 success=yes exit=41856 a0=3 a1=7ffe9abcf0d0 a2=0 a3=7ffe9abcf0bc items=0 ppid=4131 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.025000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:26.034760 containerd[1690]: time="2026-01-23T18:31:26.034704413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8447c9595f-2lbtc,Uid:be525a6b-06ca-4032-a777-c6e0f1c5eb71,Namespace:calico-system,Attempt:0,} returns sandbox id \"71ba4566ff4fda1cddf4bb5fb2588cd03d6f148bc6c181c2a9b5420ecbf8ef24\"" Jan 23 18:31:26.037208 containerd[1690]: time="2026-01-23T18:31:26.037152814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:31:26.042787 systemd-networkd[1583]: 
cali76f9233f2ba: Link UP Jan 23 18:31:26.043584 systemd-networkd[1583]: cali76f9233f2ba: Gained carrier Jan 23 18:31:26.061314 containerd[1690]: 2026-01-23 18:31:25.915 [INFO][4332] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0 calico-apiserver-779b7ffd49- calico-apiserver 644e9aff-fbfa-4d8d-bb83-94a0bb426243 802 0 2026-01-23 18:30:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:779b7ffd49 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 calico-apiserver-779b7ffd49-78hkk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali76f9233f2ba [] [] }} ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-" Jan 23 18:31:26.061314 containerd[1690]: 2026-01-23 18:31:25.920 [INFO][4332] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.061314 containerd[1690]: 2026-01-23 18:31:25.974 [INFO][4438] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" HandleID="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 
18:31:25.974 [INFO][4438] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" HandleID="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"calico-apiserver-779b7ffd49-78hkk", "timestamp":"2026-01-23 18:31:25.974062319 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:25.974 [INFO][4438] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:25.974 [INFO][4438] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:25.974 [INFO][4438] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:25.982 [INFO][4438] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:26.002 [INFO][4438] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:26.011 [INFO][4438] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:26.013 [INFO][4438] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065584 containerd[1690]: 2026-01-23 18:31:26.016 [INFO][4438] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.016 [INFO][4438] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.020 [INFO][4438] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561 Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.025 [INFO][4438] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.032 [INFO][4438] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.28.132/26] block=192.168.28.128/26 handle="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.032 [INFO][4438] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.132/26] handle="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.032 [INFO][4438] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:26.065796 containerd[1690]: 2026-01-23 18:31:26.032 [INFO][4438] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.132/26] IPv6=[] ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" HandleID="k8s-pod-network.da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.065925 containerd[1690]: 2026-01-23 18:31:26.037 [INFO][4332] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0", GenerateName:"calico-apiserver-779b7ffd49-", Namespace:"calico-apiserver", SelfLink:"", UID:"644e9aff-fbfa-4d8d-bb83-94a0bb426243", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"779b7ffd49", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"calico-apiserver-779b7ffd49-78hkk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.28.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali76f9233f2ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:26.065979 containerd[1690]: 2026-01-23 18:31:26.038 [INFO][4332] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.132/32] ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.065979 containerd[1690]: 2026-01-23 18:31:26.038 [INFO][4332] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76f9233f2ba ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.065979 containerd[1690]: 2026-01-23 18:31:26.044 [INFO][4332] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" 
Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.066039 containerd[1690]: 2026-01-23 18:31:26.045 [INFO][4332] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0", GenerateName:"calico-apiserver-779b7ffd49-", Namespace:"calico-apiserver", SelfLink:"", UID:"644e9aff-fbfa-4d8d-bb83-94a0bb426243", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"779b7ffd49", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561", Pod:"calico-apiserver-779b7ffd49-78hkk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.28.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali76f9233f2ba", MAC:"32:88:f6:85:f1:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:26.066089 containerd[1690]: 2026-01-23 18:31:26.056 [INFO][4332] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-78hkk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--78hkk-eth0" Jan 23 18:31:26.071000 audit[4471]: NETFILTER_CFG table=filter:126 family=2 entries=58 op=nft_register_chain pid=4471 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:26.071000 audit[4471]: SYSCALL arch=c000003e syscall=46 success=yes exit=30584 a0=3 a1=7fffce1f56f0 a2=0 a3=7fffce1f56dc items=0 ppid=4131 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.071000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:26.159917 containerd[1690]: time="2026-01-23T18:31:26.159792206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9xqc6,Uid:dad22ee7-a9d6-4858-9e53-0db48fecba12,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4264068514b98efa75fc93d451111505ee62f588bbf4dfd6eb791396c938d65\"" Jan 23 18:31:26.183677 containerd[1690]: time="2026-01-23T18:31:26.183604110Z" level=info msg="connecting to shim da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561" address="unix:///run/containerd/s/6258de134510e219a5adf6c49439ffda9359affe864298b0b0e22cc10fe73561" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:26.211446 systemd[1]: Started 
cri-containerd-da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561.scope - libcontainer container da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561. Jan 23 18:31:26.220000 audit: BPF prog-id=221 op=LOAD Jan 23 18:31:26.221000 audit: BPF prog-id=222 op=LOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=222 op=UNLOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=223 op=LOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=224 op=LOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=224 op=UNLOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=223 op=UNLOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.221000 audit: BPF prog-id=225 op=LOAD Jan 23 18:31:26.221000 audit[4492]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4481 pid=4492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461303136363837386134383536653466336238306136333437343031 Jan 23 18:31:26.257731 containerd[1690]: time="2026-01-23T18:31:26.257688337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-78hkk,Uid:644e9aff-fbfa-4d8d-bb83-94a0bb426243,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"da0166878a4856e4f3b80a6347401663103947682caa6efb7785def01a046561\"" Jan 23 18:31:26.487579 containerd[1690]: time="2026-01-23T18:31:26.487377937Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:26.488966 containerd[1690]: time="2026-01-23T18:31:26.488881246Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:31:26.488966 containerd[1690]: time="2026-01-23T18:31:26.488933816Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:26.489191 kubelet[2894]: E0123 18:31:26.489165 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:31:26.490007 kubelet[2894]: E0123 18:31:26.489495 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:31:26.490007 kubelet[2894]: E0123 18:31:26.489691 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.cr
t,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:26.490603 containerd[1690]: time="2026-01-23T18:31:26.490568349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:31:26.490937 kubelet[2894]: E0123 18:31:26.490837 2894 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:26.597928 systemd-networkd[1583]: vxlan.calico: Gained IPv6LL Jan 23 18:31:26.598152 systemd-networkd[1583]: cali4e656fde26b: Gained IPv6LL Jan 23 18:31:26.770213 kubelet[2894]: E0123 18:31:26.770061 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:26.843122 containerd[1690]: time="2026-01-23T18:31:26.843062374Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:26.844405 containerd[1690]: time="2026-01-23T18:31:26.844358565Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:31:26.844504 containerd[1690]: time="2026-01-23T18:31:26.844447336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:26.844772 kubelet[2894]: E0123 18:31:26.844711 2894 log.go:32] 
"PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:26.844902 kubelet[2894]: E0123 18:31:26.844765 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:26.845298 kubelet[2894]: E0123 18:31:26.845099 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,Recu
rsiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzml4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:26.845447 containerd[1690]: time="2026-01-23T18:31:26.845237252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:26.847078 kubelet[2894]: E0123 18:31:26.847023 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:31:27.319768 containerd[1690]: time="2026-01-23T18:31:27.319698104Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:27.320831 containerd[1690]: time="2026-01-23T18:31:27.320794134Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:27.321048 containerd[1690]: time="2026-01-23T18:31:27.320906859Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:27.321128 kubelet[2894]: E0123 18:31:27.321089 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:27.321182 kubelet[2894]: E0123 18:31:27.321140 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:27.321494 kubelet[2894]: E0123 18:31:27.321297 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:27.323143 kubelet[2894]: E0123 18:31:27.323097 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:31:27.377285 containerd[1690]: time="2026-01-23T18:31:27.377206578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cmmgk,Uid:4b639c85-fa00-406e-8d74-df95ab4cd9fb,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:27.377285 containerd[1690]: time="2026-01-23T18:31:27.377240448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9dvq,Uid:e681e1b7-9935-4d75-8509-9acd7616e3d8,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:27.514359 systemd-networkd[1583]: cali911b1ab5997: Link UP Jan 23 18:31:27.515880 systemd-networkd[1583]: cali911b1ab5997: Gained carrier Jan 23 18:31:27.533170 containerd[1690]: 2026-01-23 18:31:27.424 [INFO][4518] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0 coredns-668d6bf9bc- kube-system 4b639c85-fa00-406e-8d74-df95ab4cd9fb 812 0 2026-01-23 18:30:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s 
ci-4547-1-0-2-32611d5cc2 coredns-668d6bf9bc-cmmgk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali911b1ab5997 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-" Jan 23 18:31:27.533170 containerd[1690]: 2026-01-23 18:31:27.424 [INFO][4518] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533170 containerd[1690]: 2026-01-23 18:31:27.458 [INFO][4542] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" HandleID="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.462 [INFO][4542] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" HandleID="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"coredns-668d6bf9bc-cmmgk", "timestamp":"2026-01-23 18:31:27.458626547 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.462 [INFO][4542] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.462 [INFO][4542] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.463 [INFO][4542] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.476 [INFO][4542] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.481 [INFO][4542] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.486 [INFO][4542] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.488 [INFO][4542] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533422 containerd[1690]: 2026-01-23 18:31:27.490 [INFO][4542] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.490 [INFO][4542] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.491 [INFO][4542] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.495 [INFO][4542] 
ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.503 [INFO][4542] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.28.133/26] block=192.168.28.128/26 handle="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.503 [INFO][4542] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.133/26] handle="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.503 [INFO][4542] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:27.533612 containerd[1690]: 2026-01-23 18:31:27.503 [INFO][4542] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.133/26] IPv6=[] ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" HandleID="k8s-pod-network.44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.506 [INFO][4518] cni-plugin/k8s.go 418: Populated endpoint ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4b639c85-fa00-406e-8d74-df95ab4cd9fb", 
ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"coredns-668d6bf9bc-cmmgk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.28.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali911b1ab5997", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.506 [INFO][4518] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.133/32] ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.506 [INFO][4518] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali911b1ab5997 
ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.516 [INFO][4518] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.516 [INFO][4518] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4b639c85-fa00-406e-8d74-df95ab4cd9fb", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", 
ContainerID:"44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab", Pod:"coredns-668d6bf9bc-cmmgk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.28.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali911b1ab5997", MAC:"16:46:47:25:6a:b7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:27.533740 containerd[1690]: 2026-01-23 18:31:27.529 [INFO][4518] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" Namespace="kube-system" Pod="coredns-668d6bf9bc-cmmgk" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--cmmgk-eth0" Jan 23 18:31:27.557617 systemd-networkd[1583]: cali2decfde2beb: Gained IPv6LL Jan 23 18:31:27.558000 audit[4566]: NETFILTER_CFG table=filter:127 family=2 entries=54 op=nft_register_chain pid=4566 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:27.561357 kernel: kauditd_printk_skb: 303 callbacks suppressed Jan 23 18:31:27.561403 kernel: audit: type=1325 audit(1769193087.558:684): table=filter:127 family=2 entries=54 op=nft_register_chain pid=4566 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:27.561426 kernel: audit: type=1300 audit(1769193087.558:684): arch=c000003e syscall=46 success=yes exit=26116 a0=3 a1=7ffe32bc0d90 a2=0 a3=7ffe32bc0d7c items=0 ppid=4131 pid=4566 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.561453 kernel: audit: type=1327 audit(1769193087.558:684): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:27.558000 audit[4566]: SYSCALL arch=c000003e syscall=46 success=yes exit=26116 a0=3 a1=7ffe32bc0d90 a2=0 a3=7ffe32bc0d7c items=0 ppid=4131 pid=4566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.558000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:27.575586 containerd[1690]: time="2026-01-23T18:31:27.574793308Z" level=info msg="connecting to shim 44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab" address="unix:///run/containerd/s/e7a37b0128b5769b53c76a76bfcbab4e533e465117a45afa1a2011191026404b" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:27.613477 systemd[1]: Started cri-containerd-44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab.scope - libcontainer container 44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab. 
Jan 23 18:31:27.628000 audit: BPF prog-id=226 op=LOAD Jan 23 18:31:27.630669 systemd-networkd[1583]: calia510a9d4a53: Link UP Jan 23 18:31:27.631285 kernel: audit: type=1334 audit(1769193087.628:685): prog-id=226 op=LOAD Jan 23 18:31:27.629000 audit: BPF prog-id=227 op=LOAD Jan 23 18:31:27.632855 systemd-networkd[1583]: calia510a9d4a53: Gained carrier Jan 23 18:31:27.633290 kernel: audit: type=1334 audit(1769193087.629:686): prog-id=227 op=LOAD Jan 23 18:31:27.629000 audit[4585]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.637498 kernel: audit: type=1300 audit(1769193087.629:686): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.641432 kernel: audit: type=1327 audit(1769193087.629:686): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.629000 audit: BPF prog-id=227 op=UNLOAD Jan 23 18:31:27.643282 kernel: audit: type=1334 audit(1769193087.629:687): prog-id=227 op=UNLOAD Jan 23 18:31:27.629000 audit[4585]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.649008 kernel: audit: type=1300 audit(1769193087.629:687): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.649053 kernel: audit: type=1327 audit(1769193087.629:687): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.631000 audit: BPF prog-id=228 op=LOAD Jan 23 18:31:27.631000 audit[4585]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.631000 audit: BPF prog-id=229 op=LOAD Jan 23 18:31:27.631000 
audit[4585]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.631000 audit: BPF prog-id=229 op=UNLOAD Jan 23 18:31:27.631000 audit[4585]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.631000 audit: BPF prog-id=228 op=UNLOAD Jan 23 18:31:27.631000 audit[4585]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.631000 audit: BPF 
prog-id=230 op=LOAD Jan 23 18:31:27.631000 audit[4585]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4575 pid=4585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3434663734616435336436333662613636303730356466636136396232 Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.433 [INFO][4526] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0 csi-node-driver- calico-system e681e1b7-9935-4d75-8509-9acd7616e3d8 689 0 2026-01-23 18:30:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 csi-node-driver-j9dvq eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia510a9d4a53 [] [] }} ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.433 [INFO][4526] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" 
WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.478 [INFO][4548] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" HandleID="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Workload="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.478 [INFO][4548] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" HandleID="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Workload="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"csi-node-driver-j9dvq", "timestamp":"2026-01-23 18:31:27.478708438 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.478 [INFO][4548] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.503 [INFO][4548] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.505 [INFO][4548] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.576 [INFO][4548] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.588 [INFO][4548] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.595 [INFO][4548] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.598 [INFO][4548] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.601 [INFO][4548] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.601 [INFO][4548] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.603 [INFO][4548] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89 Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.609 [INFO][4548] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.623 [INFO][4548] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.28.134/26] block=192.168.28.128/26 handle="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.623 [INFO][4548] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.134/26] handle="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.623 [INFO][4548] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:27.657822 containerd[1690]: 2026-01-23 18:31:27.623 [INFO][4548] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.134/26] IPv6=[] ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" HandleID="k8s-pod-network.75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Workload="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.659625 containerd[1690]: 2026-01-23 18:31:27.625 [INFO][4526] cni-plugin/k8s.go 418: Populated endpoint ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e681e1b7-9935-4d75-8509-9acd7616e3d8", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"csi-node-driver-j9dvq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.28.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia510a9d4a53", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:27.659625 containerd[1690]: 2026-01-23 18:31:27.625 [INFO][4526] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.134/32] ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.659625 containerd[1690]: 2026-01-23 18:31:27.626 [INFO][4526] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia510a9d4a53 ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.659625 containerd[1690]: 2026-01-23 18:31:27.632 [INFO][4526] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.659625 
containerd[1690]: 2026-01-23 18:31:27.637 [INFO][4526] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e681e1b7-9935-4d75-8509-9acd7616e3d8", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89", Pod:"csi-node-driver-j9dvq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.28.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia510a9d4a53", MAC:"86:14:0c:75:4c:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:27.659625 containerd[1690]: 
2026-01-23 18:31:27.654 [INFO][4526] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" Namespace="calico-system" Pod="csi-node-driver-j9dvq" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-csi--node--driver--j9dvq-eth0" Jan 23 18:31:27.670000 audit[4611]: NETFILTER_CFG table=filter:128 family=2 entries=58 op=nft_register_chain pid=4611 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:27.670000 audit[4611]: SYSCALL arch=c000003e syscall=46 success=yes exit=27180 a0=3 a1=7ffd81e39380 a2=0 a3=7ffd81e3936c items=0 ppid=4131 pid=4611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.670000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:27.701103 containerd[1690]: time="2026-01-23T18:31:27.700936290Z" level=info msg="connecting to shim 75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89" address="unix:///run/containerd/s/ded1cb5f76d3bb1115f11b6189bbfaa0e0fbfea6d0a71216ae1273c00bc53bbf" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:27.704515 containerd[1690]: time="2026-01-23T18:31:27.704416375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cmmgk,Uid:4b639c85-fa00-406e-8d74-df95ab4cd9fb,Namespace:kube-system,Attempt:0,} returns sandbox id \"44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab\"" Jan 23 18:31:27.709113 containerd[1690]: time="2026-01-23T18:31:27.709082575Z" level=info msg="CreateContainer within sandbox \"44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 18:31:27.721310 containerd[1690]: 
time="2026-01-23T18:31:27.721270269Z" level=info msg="Container bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:27.728476 systemd[1]: Started cri-containerd-75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89.scope - libcontainer container 75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89. Jan 23 18:31:27.736583 containerd[1690]: time="2026-01-23T18:31:27.736555161Z" level=info msg="CreateContainer within sandbox \"44f74ad53d636ba660705dfca69b2bf75803853baacda6b6dcd81147b7f31aab\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7\"" Jan 23 18:31:27.737228 containerd[1690]: time="2026-01-23T18:31:27.737194105Z" level=info msg="StartContainer for \"bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7\"" Jan 23 18:31:27.738710 containerd[1690]: time="2026-01-23T18:31:27.738685298Z" level=info msg="connecting to shim bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7" address="unix:///run/containerd/s/e7a37b0128b5769b53c76a76bfcbab4e533e465117a45afa1a2011191026404b" protocol=ttrpc version=3 Jan 23 18:31:27.746000 audit: BPF prog-id=231 op=LOAD Jan 23 18:31:27.747000 audit: BPF prog-id=232 op=LOAD Jan 23 18:31:27.747000 audit[4638]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.747000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.747000 audit: BPF 
prog-id=232 op=UNLOAD Jan 23 18:31:27.747000 audit[4638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.747000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.748000 audit: BPF prog-id=233 op=LOAD Jan 23 18:31:27.748000 audit[4638]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.748000 audit: BPF prog-id=234 op=LOAD Jan 23 18:31:27.748000 audit[4638]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.748000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.748000 audit: BPF prog-id=234 op=UNLOAD Jan 23 18:31:27.748000 audit[4638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.748000 audit: BPF prog-id=233 op=UNLOAD Jan 23 18:31:27.748000 audit[4638]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.748000 audit: BPF prog-id=235 op=LOAD Jan 23 18:31:27.748000 audit[4638]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4627 pid=4638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:27.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3735666138303165313966303862363539666366356631346266656563 Jan 23 18:31:27.750886 systemd-networkd[1583]: cali76f9233f2ba: Gained IPv6LL Jan 23 18:31:27.767466 systemd[1]: Started cri-containerd-bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7.scope - libcontainer container bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7. Jan 23 18:31:27.771563 containerd[1690]: time="2026-01-23T18:31:27.771525649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9dvq,Uid:e681e1b7-9935-4d75-8509-9acd7616e3d8,Namespace:calico-system,Attempt:0,} returns sandbox id \"75fa801e19f08b659fcf5f14bfeec20284172816e90052c813d07fec29d36c89\"" Jan 23 18:31:27.775599 containerd[1690]: time="2026-01-23T18:31:27.775113953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:31:27.775689 kubelet[2894]: E0123 18:31:27.775339 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:31:27.775689 kubelet[2894]: E0123 18:31:27.775562 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:27.776220 kubelet[2894]: E0123 18:31:27.776097 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:31:27.787000 audit: BPF prog-id=236 op=LOAD Jan 23 18:31:27.788000 audit: BPF prog-id=237 op=LOAD Jan 23 18:31:27.788000 audit[4658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.788000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.789000 audit: BPF prog-id=237 op=UNLOAD Jan 23 18:31:27.789000 audit[4658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.789000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.789000 audit: BPF prog-id=238 op=LOAD Jan 23 18:31:27.789000 audit[4658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.789000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.790000 audit: BPF prog-id=239 op=LOAD Jan 23 18:31:27.790000 audit[4658]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.790000 audit: BPF prog-id=239 op=UNLOAD Jan 23 18:31:27.790000 audit[4658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:27.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.790000 audit: BPF prog-id=238 op=UNLOAD Jan 23 18:31:27.790000 audit[4658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.791000 audit: BPF prog-id=240 op=LOAD Jan 23 18:31:27.791000 audit[4658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=4575 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.791000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262363135636133666130356537373866316461613038323835623136 Jan 23 18:31:27.824139 containerd[1690]: time="2026-01-23T18:31:27.824103152Z" level=info msg="StartContainer for \"bb615ca3fa05e778f1daa08285b16dbdc45f2de6c7e4440f4588150446b87ce7\" returns successfully" Jan 23 18:31:27.833000 audit[4694]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4694 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:27.833000 audit[4694]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd5d01e920 a2=0 a3=7ffd5d01e90c items=0 ppid=3032 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.833000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:27.840000 audit[4694]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4694 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:27.840000 audit[4694]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd5d01e920 a2=0 a3=0 items=0 ppid=3032 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.840000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:27.856000 audit[4696]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=4696 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:27.856000 audit[4696]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffea1761f50 a2=0 a3=7ffea1761f3c items=0 ppid=3032 pid=4696 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.856000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:27.861000 audit[4696]: NETFILTER_CFG 
table=nat:132 family=2 entries=14 op=nft_register_rule pid=4696 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:27.861000 audit[4696]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffea1761f50 a2=0 a3=0 items=0 ppid=3032 pid=4696 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:27.861000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.121602 containerd[1690]: time="2026-01-23T18:31:28.121545466Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:28.123640 containerd[1690]: time="2026-01-23T18:31:28.123339629Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:31:28.123640 containerd[1690]: time="2026-01-23T18:31:28.123370385Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:28.123922 kubelet[2894]: E0123 18:31:28.123861 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:28.124338 kubelet[2894]: E0123 18:31:28.124304 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:28.124595 
kubelet[2894]: E0123 18:31:28.124568 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:28.127566 containerd[1690]: time="2026-01-23T18:31:28.127340985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:31:28.377388 containerd[1690]: time="2026-01-23T18:31:28.377243040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-pbrfh,Uid:7a5d89a2-c80e-4f56-8808-99252854603a,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:28.448907 containerd[1690]: time="2026-01-23T18:31:28.448716118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:28.449885 containerd[1690]: time="2026-01-23T18:31:28.449834867Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:31:28.450087 containerd[1690]: time="2026-01-23T18:31:28.449996095Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:28.450399 kubelet[2894]: E0123 18:31:28.450351 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:31:28.451300 kubelet[2894]: E0123 18:31:28.450416 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:31:28.451300 kubelet[2894]: E0123 18:31:28.450551 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppA
rmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:28.451724 kubelet[2894]: E0123 18:31:28.451693 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:28.491605 systemd-networkd[1583]: cali928040ed5c7: Link UP Jan 23 18:31:28.492570 systemd-networkd[1583]: cali928040ed5c7: Gained carrier Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.419 [INFO][4704] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0 calico-apiserver-779b7ffd49- calico-apiserver 7a5d89a2-c80e-4f56-8808-99252854603a 809 0 2026-01-23 18:30:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:779b7ffd49 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 calico-apiserver-779b7ffd49-pbrfh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali928040ed5c7 [] [] }} ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.419 [INFO][4704] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.444 [INFO][4716] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" HandleID="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.444 [INFO][4716] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" HandleID="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"calico-apiserver-779b7ffd49-pbrfh", "timestamp":"2026-01-23 18:31:28.444717938 +0000 UTC"}, 
Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.444 [INFO][4716] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.444 [INFO][4716] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.444 [INFO][4716] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.452 [INFO][4716] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.457 [INFO][4716] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.463 [INFO][4716] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.465 [INFO][4716] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.467 [INFO][4716] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.468 [INFO][4716] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.471 
[INFO][4716] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.476 [INFO][4716] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.484 [INFO][4716] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.28.135/26] block=192.168.28.128/26 handle="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.484 [INFO][4716] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.135/26] handle="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.484 [INFO][4716] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:28.513405 containerd[1690]: 2026-01-23 18:31:28.484 [INFO][4716] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.135/26] IPv6=[] ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" HandleID="k8s-pod-network.f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Workload="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.486 [INFO][4704] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0", GenerateName:"calico-apiserver-779b7ffd49-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a5d89a2-c80e-4f56-8808-99252854603a", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"779b7ffd49", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"calico-apiserver-779b7ffd49-pbrfh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.28.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali928040ed5c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.487 [INFO][4704] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.135/32] ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.487 [INFO][4704] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali928040ed5c7 ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.491 [INFO][4704] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.491 [INFO][4704] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0", GenerateName:"calico-apiserver-779b7ffd49-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a5d89a2-c80e-4f56-8808-99252854603a", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"779b7ffd49", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b", Pod:"calico-apiserver-779b7ffd49-pbrfh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.28.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali928040ed5c7", MAC:"7a:ed:cd:03:47:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:28.514515 containerd[1690]: 2026-01-23 18:31:28.510 [INFO][4704] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" Namespace="calico-apiserver" Pod="calico-apiserver-779b7ffd49-pbrfh" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-calico--apiserver--779b7ffd49--pbrfh-eth0" Jan 23 18:31:28.531000 audit[4730]: NETFILTER_CFG table=filter:133 family=2 
entries=59 op=nft_register_chain pid=4730 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:28.531000 audit[4730]: SYSCALL arch=c000003e syscall=46 success=yes exit=29476 a0=3 a1=7ffefd9b37e0 a2=0 a3=7ffefd9b37cc items=0 ppid=4131 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.531000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:28.541681 containerd[1690]: time="2026-01-23T18:31:28.541631850Z" level=info msg="connecting to shim f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b" address="unix:///run/containerd/s/f86aced671164fbb35d34d4db02df520ac06876f8d1b8927b860e2901ce3765d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:28.569452 systemd[1]: Started cri-containerd-f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b.scope - libcontainer container f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b. 
Jan 23 18:31:28.579000 audit: BPF prog-id=241 op=LOAD Jan 23 18:31:28.579000 audit: BPF prog-id=242 op=LOAD Jan 23 18:31:28.579000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=242 op=UNLOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=243 op=LOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.580000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=244 op=LOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=244 op=UNLOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=243 op=UNLOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:28.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.580000 audit: BPF prog-id=245 op=LOAD Jan 23 18:31:28.580000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=4739 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630373361306365653232313934353537313539363962626335663165 Jan 23 18:31:28.619501 containerd[1690]: time="2026-01-23T18:31:28.619458695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-779b7ffd49-pbrfh,Uid:7a5d89a2-c80e-4f56-8808-99252854603a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f073a0cee2219455715969bbc5f1ea725ee359487a21f8164cf55c6b9b056b0b\"" Jan 23 18:31:28.622209 containerd[1690]: time="2026-01-23T18:31:28.622117925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:28.779009 kubelet[2894]: E0123 18:31:28.778874 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:28.837456 systemd-networkd[1583]: calia510a9d4a53: Gained IPv6LL Jan 23 18:31:28.874000 audit[4778]: NETFILTER_CFG table=filter:134 family=2 entries=17 op=nft_register_rule pid=4778 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.874000 audit[4778]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd1ae341d0 a2=0 a3=7ffd1ae341bc items=0 ppid=3032 pid=4778 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.874000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.880000 audit[4778]: NETFILTER_CFG table=nat:135 family=2 entries=35 op=nft_register_chain pid=4778 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.880000 audit[4778]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffd1ae341d0 a2=0 a3=7ffd1ae341bc items=0 ppid=3032 pid=4778 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.880000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.950885 containerd[1690]: time="2026-01-23T18:31:28.950688602Z" level=info msg="fetch failed 
after status: 404 Not Found" host=ghcr.io Jan 23 18:31:28.951827 containerd[1690]: time="2026-01-23T18:31:28.951711062Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:28.951827 containerd[1690]: time="2026-01-23T18:31:28.951759690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:28.951979 kubelet[2894]: E0123 18:31:28.951945 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:28.952020 kubelet[2894]: E0123 18:31:28.951994 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:28.952162 kubelet[2894]: E0123 18:31:28.952129 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:28.953773 kubelet[2894]: E0123 18:31:28.953737 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:31:29.157509 systemd-networkd[1583]: cali911b1ab5997: Gained IPv6LL Jan 23 18:31:29.379198 containerd[1690]: time="2026-01-23T18:31:29.379139904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-fpt6r,Uid:9d67030a-5092-4383-b1de-ac0c48bff4df,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:29.485664 systemd-networkd[1583]: calie2c68970bba: Link UP Jan 23 18:31:29.487150 systemd-networkd[1583]: calie2c68970bba: Gained carrier Jan 23 18:31:29.503237 kubelet[2894]: I0123 18:31:29.503183 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-cmmgk" podStartSLOduration=44.503153056 podStartE2EDuration="44.503153056s" podCreationTimestamp="2026-01-23 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:28.808840665 +0000 UTC m=+51.516176232" watchObservedRunningTime="2026-01-23 18:31:29.503153056 +0000 UTC m=+52.210488621" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.424 [INFO][4779] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0 coredns-668d6bf9bc- kube-system 
9d67030a-5092-4383-b1de-ac0c48bff4df 807 0 2026-01-23 18:30:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-1-0-2-32611d5cc2 coredns-668d6bf9bc-fpt6r eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie2c68970bba [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.424 [INFO][4779] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.447 [INFO][4791] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" HandleID="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.447 [INFO][4791] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" HandleID="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f280), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-2-32611d5cc2", "pod":"coredns-668d6bf9bc-fpt6r", 
"timestamp":"2026-01-23 18:31:29.447108786 +0000 UTC"}, Hostname:"ci-4547-1-0-2-32611d5cc2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.447 [INFO][4791] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.447 [INFO][4791] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.447 [INFO][4791] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-2-32611d5cc2' Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.454 [INFO][4791] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.459 [INFO][4791] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.463 [INFO][4791] ipam/ipam.go 511: Trying affinity for 192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.464 [INFO][4791] ipam/ipam.go 158: Attempting to load block cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.466 [INFO][4791] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.28.128/26 host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.466 [INFO][4791] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.28.128/26 handle="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" host="ci-4547-1-0-2-32611d5cc2" Jan 23 
18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.467 [INFO][4791] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0 Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.474 [INFO][4791] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.28.128/26 handle="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.481 [INFO][4791] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.28.136/26] block=192.168.28.128/26 handle="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.481 [INFO][4791] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.28.136/26] handle="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" host="ci-4547-1-0-2-32611d5cc2" Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.481 [INFO][4791] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:29.504334 containerd[1690]: 2026-01-23 18:31:29.481 [INFO][4791] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.28.136/26] IPv6=[] ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" HandleID="k8s-pod-network.d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Workload="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.505576 containerd[1690]: 2026-01-23 18:31:29.483 [INFO][4779] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9d67030a-5092-4383-b1de-ac0c48bff4df", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"", Pod:"coredns-668d6bf9bc-fpt6r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.28.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calie2c68970bba", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:29.505576 containerd[1690]: 2026-01-23 18:31:29.483 [INFO][4779] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.28.136/32] ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.505576 containerd[1690]: 2026-01-23 18:31:29.483 [INFO][4779] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie2c68970bba ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.505576 containerd[1690]: 2026-01-23 18:31:29.487 [INFO][4779] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.505576 containerd[1690]: 2026-01-23 18:31:29.488 [INFO][4779] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" 
WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9d67030a-5092-4383-b1de-ac0c48bff4df", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 30, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-2-32611d5cc2", ContainerID:"d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0", Pod:"coredns-668d6bf9bc-fpt6r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.28.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie2c68970bba", MAC:"b6:47:2e:56:52:66", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:29.505576 
containerd[1690]: 2026-01-23 18:31:29.500 [INFO][4779] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" Namespace="kube-system" Pod="coredns-668d6bf9bc-fpt6r" WorkloadEndpoint="ci--4547--1--0--2--32611d5cc2-k8s-coredns--668d6bf9bc--fpt6r-eth0" Jan 23 18:31:29.518000 audit[4805]: NETFILTER_CFG table=filter:136 family=2 entries=48 op=nft_register_chain pid=4805 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:29.518000 audit[4805]: SYSCALL arch=c000003e syscall=46 success=yes exit=22688 a0=3 a1=7ffefa81f290 a2=0 a3=7ffefa81f27c items=0 ppid=4131 pid=4805 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.518000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:29.530321 containerd[1690]: time="2026-01-23T18:31:29.530280760Z" level=info msg="connecting to shim d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0" address="unix:///run/containerd/s/88c1e3c3440441ab6903fb5108d1e153b0265d0d07b581444b25ae57b5b88934" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:29.562501 systemd[1]: Started cri-containerd-d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0.scope - libcontainer container d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0. 
Jan 23 18:31:29.572000 audit: BPF prog-id=246 op=LOAD Jan 23 18:31:29.572000 audit: BPF prog-id=247 op=LOAD Jan 23 18:31:29.572000 audit[4826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=247 op=UNLOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=248 op=LOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.573000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=249 op=LOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=249 op=UNLOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=248 op=UNLOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:31:29.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.573000 audit: BPF prog-id=250 op=LOAD Jan 23 18:31:29.573000 audit[4826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4814 pid=4826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435613531383430636364623038663132333038656139613065633239 Jan 23 18:31:29.608419 containerd[1690]: time="2026-01-23T18:31:29.608373160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-fpt6r,Uid:9d67030a-5092-4383-b1de-ac0c48bff4df,Namespace:kube-system,Attempt:0,} returns sandbox id \"d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0\"" Jan 23 18:31:29.610966 containerd[1690]: time="2026-01-23T18:31:29.610915678Z" level=info msg="CreateContainer within sandbox \"d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 18:31:29.622410 containerd[1690]: time="2026-01-23T18:31:29.622369384Z" level=info msg="Container be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:29.630221 containerd[1690]: time="2026-01-23T18:31:29.630172460Z" level=info msg="CreateContainer within sandbox \"d5a51840ccdb08f12308ea9a0ec29d614b65c4ebd43868125f831022c01a51e0\" for 
&ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287\"" Jan 23 18:31:29.631270 containerd[1690]: time="2026-01-23T18:31:29.631233811Z" level=info msg="StartContainer for \"be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287\"" Jan 23 18:31:29.632223 containerd[1690]: time="2026-01-23T18:31:29.632187487Z" level=info msg="connecting to shim be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287" address="unix:///run/containerd/s/88c1e3c3440441ab6903fb5108d1e153b0265d0d07b581444b25ae57b5b88934" protocol=ttrpc version=3 Jan 23 18:31:29.655486 systemd[1]: Started cri-containerd-be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287.scope - libcontainer container be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287. Jan 23 18:31:29.667000 audit: BPF prog-id=251 op=LOAD Jan 23 18:31:29.667000 audit: BPF prog-id=252 op=LOAD Jan 23 18:31:29.667000 audit[4852]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.667000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=252 op=UNLOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=253 op=LOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=254 op=LOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=254 op=UNLOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=253 op=UNLOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.668000 audit: BPF prog-id=255 op=LOAD Jan 23 18:31:29.668000 audit[4852]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4814 pid=4852 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265373862326235613065386636346338656336376365653238633639 Jan 23 18:31:29.686177 containerd[1690]: time="2026-01-23T18:31:29.686130877Z" level=info msg="StartContainer for \"be78b2b5a0e8f64c8ec67cee28c69c85f01044c7e645ca7e5ab96728ca751287\" returns successfully" Jan 23 18:31:29.790495 kubelet[2894]: E0123 18:31:29.790321 2894 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:31:29.792013 kubelet[2894]: E0123 18:31:29.791395 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:29.797865 systemd-networkd[1583]: cali928040ed5c7: Gained IPv6LL Jan 23 18:31:29.812567 kubelet[2894]: I0123 18:31:29.812411 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-fpt6r" podStartSLOduration=44.812301705 podStartE2EDuration="44.812301705s" podCreationTimestamp="2026-01-23 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:29.808181805 +0000 UTC m=+52.515517373" 
watchObservedRunningTime="2026-01-23 18:31:29.812301705 +0000 UTC m=+52.519637271" Jan 23 18:31:29.939000 audit[4886]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=4886 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.939000 audit[4886]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff7028a810 a2=0 a3=7fff7028a7fc items=0 ppid=3032 pid=4886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.939000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:29.944000 audit[4886]: NETFILTER_CFG table=nat:138 family=2 entries=44 op=nft_register_rule pid=4886 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.944000 audit[4886]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7fff7028a810 a2=0 a3=7fff7028a7fc items=0 ppid=3032 pid=4886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.944000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:30.501941 systemd-networkd[1583]: calie2c68970bba: Gained IPv6LL Jan 23 18:31:30.967000 audit[4893]: NETFILTER_CFG table=filter:139 family=2 entries=14 op=nft_register_rule pid=4893 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:30.967000 audit[4893]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffff322a7e0 a2=0 a3=7ffff322a7cc items=0 ppid=3032 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.967000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:30.980000 audit[4893]: NETFILTER_CFG table=nat:140 family=2 entries=56 op=nft_register_chain pid=4893 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:30.980000 audit[4893]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffff322a7e0 a2=0 a3=7ffff322a7cc items=0 ppid=3032 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:38.378646 containerd[1690]: time="2026-01-23T18:31:38.378475729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:31:38.728533 containerd[1690]: time="2026-01-23T18:31:38.728334649Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:38.729575 containerd[1690]: time="2026-01-23T18:31:38.729546351Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:31:38.729723 containerd[1690]: time="2026-01-23T18:31:38.729639427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:38.729777 kubelet[2894]: E0123 18:31:38.729747 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:38.730055 kubelet[2894]: E0123 18:31:38.729790 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:38.730055 kubelet[2894]: E0123 18:31:38.729901 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:9dc376b3bb22419f9d12e1d73f9667ce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[
]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:38.732557 containerd[1690]: time="2026-01-23T18:31:38.732507320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:31:39.075755 containerd[1690]: time="2026-01-23T18:31:39.075509199Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:39.078362 containerd[1690]: time="2026-01-23T18:31:39.078238754Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:31:39.078362 containerd[1690]: time="2026-01-23T18:31:39.078287709Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:39.079030 kubelet[2894]: E0123 18:31:39.078654 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:31:39.079030 kubelet[2894]: E0123 18:31:39.078704 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 
23 18:31:39.079030 kubelet[2894]: E0123 18:31:39.078832 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:39.080627 kubelet[2894]: E0123 18:31:39.080562 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:31:40.377898 containerd[1690]: time="2026-01-23T18:31:40.377840629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:40.757269 containerd[1690]: time="2026-01-23T18:31:40.756994223Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:40.758585 containerd[1690]: time="2026-01-23T18:31:40.758465717Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:40.758585 containerd[1690]: time="2026-01-23T18:31:40.758557216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:40.759312 containerd[1690]: time="2026-01-23T18:31:40.759036790Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:31:40.759345 kubelet[2894]: E0123 18:31:40.758689 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:40.759345 kubelet[2894]: E0123 18:31:40.758740 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:40.759345 kubelet[2894]: E0123 18:31:40.758975 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:40.760830 kubelet[2894]: E0123 18:31:40.760794 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:31:41.073566 containerd[1690]: time="2026-01-23T18:31:41.073415501Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:41.074922 containerd[1690]: time="2026-01-23T18:31:41.074880084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:31:41.075014 containerd[1690]: time="2026-01-23T18:31:41.074989302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:41.075190 kubelet[2894]: E0123 18:31:41.075147 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:41.075236 kubelet[2894]: E0123 18:31:41.075201 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:41.075375 kubelet[2894]: E0123 18:31:41.075341 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:41.078142 containerd[1690]: time="2026-01-23T18:31:41.078103420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:31:41.404977 containerd[1690]: time="2026-01-23T18:31:41.404927432Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:41.406806 containerd[1690]: time="2026-01-23T18:31:41.406771448Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:31:41.406996 kubelet[2894]: E0123 18:31:41.406964 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:31:41.407073 containerd[1690]: time="2026-01-23T18:31:41.406859739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:41.407255 kubelet[2894]: E0123 18:31:41.407008 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 
18:31:41.408578 kubelet[2894]: E0123 18:31:41.407573 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:41.408750 containerd[1690]: time="2026-01-23T18:31:41.408496521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:41.409235 kubelet[2894]: E0123 18:31:41.409199 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:41.768805 containerd[1690]: time="2026-01-23T18:31:41.768534272Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:41.769771 containerd[1690]: time="2026-01-23T18:31:41.769673113Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:41.769771 containerd[1690]: time="2026-01-23T18:31:41.769732370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:41.769954 kubelet[2894]: E0123 
18:31:41.769883 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:41.769954 kubelet[2894]: E0123 18:31:41.769933 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:41.770279 kubelet[2894]: E0123 18:31:41.770167 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:41.770809 containerd[1690]: time="2026-01-23T18:31:41.770601522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:31:41.771308 kubelet[2894]: E0123 18:31:41.771251 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:31:42.114991 containerd[1690]: time="2026-01-23T18:31:42.114779611Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
18:31:42.116044 containerd[1690]: time="2026-01-23T18:31:42.115950144Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:31:42.116108 containerd[1690]: time="2026-01-23T18:31:42.116030038Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:42.116325 kubelet[2894]: E0123 18:31:42.116284 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:42.116386 kubelet[2894]: E0123 18:31:42.116341 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:42.117096 kubelet[2894]: E0123 18:31:42.117047 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzml4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:42.118365 kubelet[2894]: E0123 18:31:42.118330 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:31:42.378280 containerd[1690]: time="2026-01-23T18:31:42.378068749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:31:42.735038 containerd[1690]: time="2026-01-23T18:31:42.734893195Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:42.736825 containerd[1690]: 
time="2026-01-23T18:31:42.736762799Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:31:42.736825 containerd[1690]: time="2026-01-23T18:31:42.736797324Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:42.737085 kubelet[2894]: E0123 18:31:42.737036 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:31:42.737131 kubelet[2894]: E0123 18:31:42.737095 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:31:42.737296 kubelet[2894]: E0123 18:31:42.737244 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:42.738463 kubelet[2894]: E0123 18:31:42.738441 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:51.378052 kubelet[2894]: E0123 18:31:51.377860 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:31:51.379401 kubelet[2894]: E0123 18:31:51.379347 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:31:53.378881 kubelet[2894]: E0123 18:31:53.378587 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:31:53.379869 kubelet[2894]: E0123 18:31:53.379827 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:31:54.377628 kubelet[2894]: E0123 18:31:54.377579 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:31:56.378474 kubelet[2894]: E0123 18:31:56.378223 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:04.377566 containerd[1690]: 
time="2026-01-23T18:32:04.377522704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:32:04.726044 containerd[1690]: time="2026-01-23T18:32:04.725218619Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:04.726909 containerd[1690]: time="2026-01-23T18:32:04.726882189Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:32:04.727204 containerd[1690]: time="2026-01-23T18:32:04.727064808Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:04.727394 kubelet[2894]: E0123 18:32:04.727354 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:04.727655 kubelet[2894]: E0123 18:32:04.727401 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:04.727682 kubelet[2894]: E0123 18:32:04.727628 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzml4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:04.728145 containerd[1690]: time="2026-01-23T18:32:04.728100498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:32:04.729691 kubelet[2894]: E0123 18:32:04.729665 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:32:05.055346 containerd[1690]: time="2026-01-23T18:32:05.055191479Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:05.057796 containerd[1690]: 
time="2026-01-23T18:32:05.057759700Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:32:05.057919 containerd[1690]: time="2026-01-23T18:32:05.057834520Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:05.058231 kubelet[2894]: E0123 18:32:05.058049 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:05.058231 kubelet[2894]: E0123 18:32:05.058106 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:05.058410 containerd[1690]: time="2026-01-23T18:32:05.058393980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:05.058663 kubelet[2894]: E0123 18:32:05.058614 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:9dc376b3bb22419f9d12e1d73f9667ce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:05.395726 containerd[1690]: time="2026-01-23T18:32:05.395673491Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:05.396723 containerd[1690]: 
time="2026-01-23T18:32:05.396695521Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:05.396770 containerd[1690]: time="2026-01-23T18:32:05.396758941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:05.397064 kubelet[2894]: E0123 18:32:05.396898 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:05.397064 kubelet[2894]: E0123 18:32:05.396941 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:05.397311 kubelet[2894]: E0123 18:32:05.397209 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:05.399023 kubelet[2894]: E0123 18:32:05.398734 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:32:05.399060 containerd[1690]: time="2026-01-23T18:32:05.397631428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:32:05.758458 containerd[1690]: time="2026-01-23T18:32:05.758348087Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:05.760215 containerd[1690]: time="2026-01-23T18:32:05.760187528Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:32:05.761005 containerd[1690]: time="2026-01-23T18:32:05.760306826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:05.761255 kubelet[2894]: E0123 18:32:05.761224 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:05.761576 kubelet[2894]: E0123 18:32:05.761558 2894 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:05.762562 kubelet[2894]: E0123 18:32:05.762528 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,Localho
stProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:05.763371 containerd[1690]: time="2026-01-23T18:32:05.763350278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:32:05.764501 kubelet[2894]: E0123 18:32:05.764468 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:32:06.101116 containerd[1690]: time="2026-01-23T18:32:06.101009910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:06.102997 containerd[1690]: time="2026-01-23T18:32:06.102958587Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:32:06.103113 
containerd[1690]: time="2026-01-23T18:32:06.102975140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:06.103339 kubelet[2894]: E0123 18:32:06.103311 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:06.103387 kubelet[2894]: E0123 18:32:06.103355 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:06.103663 kubelet[2894]: E0123 18:32:06.103458 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 23 18:32:06.106775 containerd[1690]: time="2026-01-23T18:32:06.106610296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:32:06.518249 containerd[1690]: time="2026-01-23T18:32:06.518095318Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:06.520192 containerd[1690]: time="2026-01-23T18:32:06.520003077Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:32:06.520192 containerd[1690]: time="2026-01-23T18:32:06.520093404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:06.520289 kubelet[2894]: E0123 18:32:06.520226 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:06.520289 kubelet[2894]: E0123 18:32:06.520282 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:06.521174 kubelet[2894]: E0123 18:32:06.521082 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:06.522295 kubelet[2894]: E0123 18:32:06.522230 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:32:07.380277 containerd[1690]: time="2026-01-23T18:32:07.380221204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:32:07.721182 containerd[1690]: time="2026-01-23T18:32:07.721069955Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:07.722486 containerd[1690]: time="2026-01-23T18:32:07.722453799Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:32:07.722559 containerd[1690]: time="2026-01-23T18:32:07.722523568Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:07.722661 kubelet[2894]: E0123 18:32:07.722628 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:07.722862 kubelet[2894]: E0123 18:32:07.722675 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:07.722983 kubelet[2894]: E0123 18:32:07.722945 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:07.724784 kubelet[2894]: E0123 18:32:07.724742 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:32:09.380933 
containerd[1690]: time="2026-01-23T18:32:09.380352553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:09.704510 containerd[1690]: time="2026-01-23T18:32:09.704405195Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:09.705797 containerd[1690]: time="2026-01-23T18:32:09.705759366Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:09.705848 containerd[1690]: time="2026-01-23T18:32:09.705835591Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:09.706188 kubelet[2894]: E0123 18:32:09.705971 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:09.706188 kubelet[2894]: E0123 18:32:09.706018 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:09.706188 kubelet[2894]: E0123 18:32:09.706144 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:09.707362 kubelet[2894]: E0123 18:32:09.707319 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:15.379024 kubelet[2894]: E0123 18:32:15.378786 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:32:17.378843 kubelet[2894]: E0123 18:32:17.378735 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:32:19.379332 kubelet[2894]: E0123 18:32:19.379293 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:32:20.377503 kubelet[2894]: E0123 18:32:20.377470 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:32:20.378484 kubelet[2894]: E0123 18:32:20.377698 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:20.378826 kubelet[2894]: E0123 18:32:20.378800 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:32:28.377630 kubelet[2894]: E0123 18:32:28.377497 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:32:31.379248 kubelet[2894]: E0123 18:32:31.379152 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:32:32.378206 kubelet[2894]: E0123 18:32:32.378158 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:32:33.378081 kubelet[2894]: E0123 18:32:33.378041 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:32:34.377552 kubelet[2894]: E0123 18:32:34.377312 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:35.379850 kubelet[2894]: E0123 18:32:35.379819 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:32:43.378278 kubelet[2894]: E0123 18:32:43.378228 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:32:43.379733 kubelet[2894]: E0123 18:32:43.379698 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:32:46.378472 kubelet[2894]: E0123 18:32:46.377686 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:32:46.378472 kubelet[2894]: E0123 18:32:46.377823 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" 
podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:46.379334 containerd[1690]: time="2026-01-23T18:32:46.379307268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:32:46.723029 containerd[1690]: time="2026-01-23T18:32:46.722838185Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:46.724104 containerd[1690]: time="2026-01-23T18:32:46.724073731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:32:46.724175 containerd[1690]: time="2026-01-23T18:32:46.724146735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:46.724435 kubelet[2894]: E0123 18:32:46.724248 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:46.724435 kubelet[2894]: E0123 18:32:46.724403 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:46.724522 kubelet[2894]: E0123 18:32:46.724496 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:9dc376b3bb22419f9d12e1d73f9667ce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:46.726451 containerd[1690]: time="2026-01-23T18:32:46.726434535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:32:47.049390 containerd[1690]: 
time="2026-01-23T18:32:47.049042670Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:47.050172 containerd[1690]: time="2026-01-23T18:32:47.050132986Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:32:47.050220 containerd[1690]: time="2026-01-23T18:32:47.050208143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:47.050408 kubelet[2894]: E0123 18:32:47.050370 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:47.050446 kubelet[2894]: E0123 18:32:47.050416 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:47.050535 kubelet[2894]: E0123 18:32:47.050508 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:47.051916 kubelet[2894]: E0123 18:32:47.051864 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:32:50.377959 containerd[1690]: time="2026-01-23T18:32:50.377866943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:50.696589 containerd[1690]: time="2026-01-23T18:32:50.696369286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:50.697803 containerd[1690]: time="2026-01-23T18:32:50.697716375Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:50.697803 containerd[1690]: time="2026-01-23T18:32:50.697785054Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:50.698018 kubelet[2894]: E0123 18:32:50.697988 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:50.698471 kubelet[2894]: E0123 18:32:50.698311 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:50.698471 kubelet[2894]: E0123 18:32:50.698429 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:50.699594 kubelet[2894]: E0123 18:32:50.699564 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:32:54.145776 systemd[1]: Started sshd@9-10.0.9.101:22-68.220.241.50:53272.service - OpenSSH per-connection server daemon (68.220.241.50:53272). 
Jan 23 18:32:54.144000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.9.101:22-68.220.241.50:53272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:54.146463 kernel: kauditd_printk_skb: 164 callbacks suppressed Jan 23 18:32:54.146507 kernel: audit: type=1130 audit(1769193174.144:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.9.101:22-68.220.241.50:53272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:54.378774 containerd[1690]: time="2026-01-23T18:32:54.378744150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:32:54.678000 audit[5033]: USER_ACCT pid=5033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.680046 sshd[5033]: Accepted publickey for core from 68.220.241.50 port 53272 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:32:54.683298 kernel: audit: type=1101 audit(1769193174.678:747): pid=5033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.679000 audit[5033]: CRED_ACQ pid=5033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.683755 sshd-session[5033]: pam_unix(sshd:session): session opened for user core(uid=500) by 
core(uid=0) Jan 23 18:32:54.687285 kernel: audit: type=1103 audit(1769193174.679:748): pid=5033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.690279 kernel: audit: type=1006 audit(1769193174.679:749): pid=5033 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 23 18:32:54.679000 audit[5033]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe50b3e120 a2=3 a3=0 items=0 ppid=1 pid=5033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:54.695277 kernel: audit: type=1300 audit(1769193174.679:749): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe50b3e120 a2=3 a3=0 items=0 ppid=1 pid=5033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:54.679000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:54.697278 kernel: audit: type=1327 audit(1769193174.679:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:54.698913 systemd-logind[1659]: New session 11 of user core. Jan 23 18:32:54.703543 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 23 18:32:54.706000 audit[5033]: USER_START pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.710000 audit[5037]: CRED_ACQ pid=5037 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.713290 kernel: audit: type=1105 audit(1769193174.706:750): pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.713350 kernel: audit: type=1103 audit(1769193174.710:751): pid=5037 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:54.825773 containerd[1690]: time="2026-01-23T18:32:54.825636438Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:54.826869 containerd[1690]: time="2026-01-23T18:32:54.826775207Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:32:54.826869 containerd[1690]: time="2026-01-23T18:32:54.826841019Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:54.827091 kubelet[2894]: E0123 18:32:54.827048 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:54.828115 kubelet[2894]: E0123 18:32:54.827101 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:54.828115 kubelet[2894]: E0123 18:32:54.827244 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzml4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:54.829232 kubelet[2894]: E0123 18:32:54.829200 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:32:55.057534 sshd[5037]: Connection closed by 68.220.241.50 port 53272 Jan 23 18:32:55.060070 sshd-session[5033]: pam_unix(sshd:session): session closed for user core Jan 23 18:32:55.068255 kernel: audit: type=1106 audit(1769193175.060:752): pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:55.060000 audit[5033]: USER_END pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:55.066665 systemd[1]: sshd@9-10.0.9.101:22-68.220.241.50:53272.service: Deactivated successfully. Jan 23 18:32:55.068188 systemd[1]: session-11.scope: Deactivated successfully. Jan 23 18:32:55.071035 systemd-logind[1659]: Session 11 logged out. Waiting for processes to exit. Jan 23 18:32:55.073537 systemd-logind[1659]: Removed session 11. 
Jan 23 18:32:55.060000 audit[5033]: CRED_DISP pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:55.078274 kernel: audit: type=1104 audit(1769193175.060:753): pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:32:55.065000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.9.101:22-68.220.241.50:53272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:57.378418 containerd[1690]: time="2026-01-23T18:32:57.378368426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:57.710343 containerd[1690]: time="2026-01-23T18:32:57.710198570Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:57.711642 containerd[1690]: time="2026-01-23T18:32:57.711535497Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:57.711642 containerd[1690]: time="2026-01-23T18:32:57.711596958Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:57.711763 kubelet[2894]: E0123 18:32:57.711734 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:57.712042 kubelet[2894]: E0123 18:32:57.711774 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:57.712042 kubelet[2894]: E0123 18:32:57.711955 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:57.712950 containerd[1690]: time="2026-01-23T18:32:57.712586095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:32:57.713176 kubelet[2894]: E0123 18:32:57.713109 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:32:58.033660 containerd[1690]: time="2026-01-23T18:32:58.033363160Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
18:32:58.034755 containerd[1690]: time="2026-01-23T18:32:58.034676256Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:32:58.034755 containerd[1690]: time="2026-01-23T18:32:58.034747750Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:58.034935 kubelet[2894]: E0123 18:32:58.034899 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:58.034972 kubelet[2894]: E0123 18:32:58.034944 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:58.035091 kubelet[2894]: E0123 18:32:58.035058 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:58.036789 kubelet[2894]: E0123 18:32:58.036768 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:32:58.377693 containerd[1690]: time="2026-01-23T18:32:58.377301989Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:32:58.710737 containerd[1690]: time="2026-01-23T18:32:58.710280155Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
18:32:58.711722 containerd[1690]: time="2026-01-23T18:32:58.711659452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:32:58.712031 containerd[1690]: time="2026-01-23T18:32:58.711677480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:58.712116 kubelet[2894]: E0123 18:32:58.712082 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:58.712372 kubelet[2894]: E0123 18:32:58.712125 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:58.712372 kubelet[2894]: E0123 18:32:58.712223 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 23 18:32:58.715273 containerd[1690]: time="2026-01-23T18:32:58.715163525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:32:59.046113 containerd[1690]: time="2026-01-23T18:32:59.046012323Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:59.047344 containerd[1690]: time="2026-01-23T18:32:59.047314958Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:32:59.047645 containerd[1690]: time="2026-01-23T18:32:59.047378880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:59.047724 kubelet[2894]: E0123 18:32:59.047686 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:59.047774 kubelet[2894]: E0123 18:32:59.047736 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:59.047888 kubelet[2894]: E0123 18:32:59.047836 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:59.049180 kubelet[2894]: E0123 18:32:59.049159 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:32:59.378899 kubelet[2894]: E0123 18:32:59.378861 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:33:00.171231 systemd[1]: Started sshd@10-10.0.9.101:22-68.220.241.50:53276.service - OpenSSH per-connection server daemon (68.220.241.50:53276). 
Jan 23 18:33:00.176039 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:00.176076 kernel: audit: type=1130 audit(1769193180.170:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.9.101:22-68.220.241.50:53276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:00.170000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.9.101:22-68.220.241.50:53276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:00.709000 audit[5070]: USER_ACCT pid=5070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.714611 sshd[5070]: Accepted publickey for core from 68.220.241.50 port 53276 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:00.716230 kernel: audit: type=1101 audit(1769193180.709:756): pid=5070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.716088 sshd-session[5070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:00.714000 audit[5070]: CRED_ACQ pid=5070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.723764 kernel: audit: type=1103 audit(1769193180.714:757): pid=5070 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.723819 kernel: audit: type=1006 audit(1769193180.714:758): pid=5070 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 23 18:33:00.731243 kernel: audit: type=1300 audit(1769193180.714:758): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaac63a90 a2=3 a3=0 items=0 ppid=1 pid=5070 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:00.714000 audit[5070]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcaac63a90 a2=3 a3=0 items=0 ppid=1 pid=5070 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:00.736198 systemd-logind[1659]: New session 12 of user core. Jan 23 18:33:00.714000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:00.738975 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 23 18:33:00.739685 kernel: audit: type=1327 audit(1769193180.714:758): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:00.742000 audit[5070]: USER_START pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.749291 kernel: audit: type=1105 audit(1769193180.742:759): pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.747000 audit[5074]: CRED_ACQ pid=5074 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:00.754274 kernel: audit: type=1103 audit(1769193180.747:760): pid=5074 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:01.129752 sshd[5074]: Connection closed by 68.220.241.50 port 53276 Jan 23 18:33:01.130414 sshd-session[5070]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:01.131000 audit[5070]: USER_END pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:01.140134 kernel: audit: type=1106 audit(1769193181.131:761): pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:01.140206 kernel: audit: type=1104 audit(1769193181.131:762): pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:01.131000 audit[5070]: CRED_DISP pid=5070 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:01.138225 systemd[1]: sshd@10-10.0.9.101:22-68.220.241.50:53276.service: Deactivated successfully. Jan 23 18:33:01.141011 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 18:33:01.142438 systemd-logind[1659]: Session 12 logged out. Waiting for processes to exit. Jan 23 18:33:01.137000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.9.101:22-68.220.241.50:53276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:01.143808 systemd-logind[1659]: Removed session 12. 
Jan 23 18:33:02.378483 kubelet[2894]: E0123 18:33:02.378400 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:33:05.380189 kubelet[2894]: E0123 18:33:05.380152 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:33:06.236131 systemd[1]: Started sshd@11-10.0.9.101:22-68.220.241.50:48354.service - OpenSSH per-connection server daemon (68.220.241.50:48354). Jan 23 18:33:06.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.9.101:22-68.220.241.50:48354 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:06.238867 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:06.238912 kernel: audit: type=1130 audit(1769193186.236:764): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.9.101:22-68.220.241.50:48354 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:06.763000 audit[5086]: USER_ACCT pid=5086 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.768244 sshd[5086]: Accepted publickey for core from 68.220.241.50 port 48354 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:06.769479 sshd-session[5086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:06.770337 kernel: audit: type=1101 audit(1769193186.763:765): pid=5086 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.767000 audit[5086]: CRED_ACQ pid=5086 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.774358 kernel: audit: type=1103 audit(1769193186.767:766): pid=5086 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.776888 systemd-logind[1659]: New session 13 of user core. 
Jan 23 18:33:06.767000 audit[5086]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd765b12d0 a2=3 a3=0 items=0 ppid=1 pid=5086 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:06.779419 kernel: audit: type=1006 audit(1769193186.767:767): pid=5086 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 23 18:33:06.779467 kernel: audit: type=1300 audit(1769193186.767:767): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd765b12d0 a2=3 a3=0 items=0 ppid=1 pid=5086 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:06.767000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:06.782325 kernel: audit: type=1327 audit(1769193186.767:767): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:06.783441 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 23 18:33:06.785000 audit[5086]: USER_START pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.790000 audit[5090]: CRED_ACQ pid=5090 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.793392 kernel: audit: type=1105 audit(1769193186.785:768): pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:06.793447 kernel: audit: type=1103 audit(1769193186.790:769): pid=5090 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.157766 sshd[5090]: Connection closed by 68.220.241.50 port 48354 Jan 23 18:33:07.158230 sshd-session[5086]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:07.158000 audit[5086]: USER_END pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.163447 systemd[1]: 
sshd@11-10.0.9.101:22-68.220.241.50:48354.service: Deactivated successfully. Jan 23 18:33:07.165529 kernel: audit: type=1106 audit(1769193187.158:770): pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.166323 systemd[1]: session-13.scope: Deactivated successfully. Jan 23 18:33:07.158000 audit[5086]: CRED_DISP pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.169638 systemd-logind[1659]: Session 13 logged out. Waiting for processes to exit. Jan 23 18:33:07.170679 systemd-logind[1659]: Removed session 13. Jan 23 18:33:07.172297 kernel: audit: type=1104 audit(1769193187.158:771): pid=5086 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.161000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.9.101:22-68.220.241.50:48354 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:07.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.9.101:22-68.220.241.50:48362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:07.268380 systemd[1]: Started sshd@12-10.0.9.101:22-68.220.241.50:48362.service - OpenSSH per-connection server daemon (68.220.241.50:48362). 
Jan 23 18:33:07.799000 audit[5102]: USER_ACCT pid=5102 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.801493 sshd[5102]: Accepted publickey for core from 68.220.241.50 port 48362 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:07.800000 audit[5102]: CRED_ACQ pid=5102 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.801000 audit[5102]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed89e4850 a2=3 a3=0 items=0 ppid=1 pid=5102 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:07.801000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:07.803022 sshd-session[5102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:07.806853 systemd-logind[1659]: New session 14 of user core. Jan 23 18:33:07.815522 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 23 18:33:07.817000 audit[5102]: USER_START pid=5102 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:07.818000 audit[5106]: CRED_ACQ pid=5106 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.187876 sshd[5106]: Connection closed by 68.220.241.50 port 48362 Jan 23 18:33:08.189381 sshd-session[5102]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:08.189000 audit[5102]: USER_END pid=5102 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.189000 audit[5102]: CRED_DISP pid=5102 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.192770 systemd[1]: sshd@12-10.0.9.101:22-68.220.241.50:48362.service: Deactivated successfully. Jan 23 18:33:08.192000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.9.101:22-68.220.241.50:48362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:08.194849 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 23 18:33:08.196955 systemd-logind[1659]: Session 14 logged out. Waiting for processes to exit. Jan 23 18:33:08.197928 systemd-logind[1659]: Removed session 14. Jan 23 18:33:08.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.9.101:22-68.220.241.50:48368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:08.306518 systemd[1]: Started sshd@13-10.0.9.101:22-68.220.241.50:48368.service - OpenSSH per-connection server daemon (68.220.241.50:48368). Jan 23 18:33:08.824000 audit[5116]: USER_ACCT pid=5116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.826429 sshd[5116]: Accepted publickey for core from 68.220.241.50 port 48368 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:08.825000 audit[5116]: CRED_ACQ pid=5116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.826000 audit[5116]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc5fce64c0 a2=3 a3=0 items=0 ppid=1 pid=5116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:08.826000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:08.828143 sshd-session[5116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:08.834169 systemd-logind[1659]: New session 15 of user core. 
Jan 23 18:33:08.839442 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 23 18:33:08.843000 audit[5116]: USER_START pid=5116 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:08.844000 audit[5120]: CRED_ACQ pid=5120 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:09.199204 sshd[5120]: Connection closed by 68.220.241.50 port 48368 Jan 23 18:33:09.199686 sshd-session[5116]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:09.201000 audit[5116]: USER_END pid=5116 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:09.201000 audit[5116]: CRED_DISP pid=5116 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:09.205111 systemd[1]: sshd@13-10.0.9.101:22-68.220.241.50:48368.service: Deactivated successfully. Jan 23 18:33:09.204000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.9.101:22-68.220.241.50:48368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:09.206751 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 18:33:09.210319 systemd-logind[1659]: Session 15 logged out. Waiting for processes to exit. Jan 23 18:33:09.213945 systemd-logind[1659]: Removed session 15. Jan 23 18:33:10.380381 kubelet[2894]: E0123 18:33:10.380327 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:33:11.379208 kubelet[2894]: E0123 18:33:11.379181 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:33:11.380095 kubelet[2894]: E0123 18:33:11.379732 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:33:12.377685 kubelet[2894]: E0123 18:33:12.377622 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:33:14.310495 systemd[1]: Started sshd@14-10.0.9.101:22-68.220.241.50:49796.service - OpenSSH per-connection server daemon (68.220.241.50:49796). Jan 23 18:33:14.314601 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 23 18:33:14.314655 kernel: audit: type=1130 audit(1769193194.309:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.9.101:22-68.220.241.50:49796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:14.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.9.101:22-68.220.241.50:49796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:14.836000 audit[5135]: USER_ACCT pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.839987 sshd[5135]: Accepted publickey for core from 68.220.241.50 port 49796 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:14.840630 sshd-session[5135]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:14.837000 audit[5135]: CRED_ACQ pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.843842 kernel: audit: type=1101 audit(1769193194.836:792): pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.843899 kernel: audit: type=1103 audit(1769193194.837:793): pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.837000 audit[5135]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcef869700 a2=3 a3=0 items=0 ppid=1 pid=5135 auid=500 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:14.851115 kernel: audit: type=1006 audit(1769193194.837:794): pid=5135 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 23 18:33:14.851160 kernel: audit: type=1300 audit(1769193194.837:794): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcef869700 a2=3 a3=0 items=0 ppid=1 pid=5135 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:14.837000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:14.854395 kernel: audit: type=1327 audit(1769193194.837:794): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:14.856094 systemd-logind[1659]: New session 16 of user core. Jan 23 18:33:14.864453 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 23 18:33:14.866000 audit[5135]: USER_START pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.871000 audit[5139]: CRED_ACQ pid=5139 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.874150 kernel: audit: type=1105 audit(1769193194.866:795): pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:14.874193 kernel: audit: type=1103 audit(1769193194.871:796): pid=5139 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:15.197302 sshd[5139]: Connection closed by 68.220.241.50 port 49796 Jan 23 18:33:15.198575 sshd-session[5135]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:15.199000 audit[5135]: USER_END pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:15.203050 systemd[1]: 
sshd@14-10.0.9.101:22-68.220.241.50:49796.service: Deactivated successfully. Jan 23 18:33:15.205028 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 18:33:15.206295 kernel: audit: type=1106 audit(1769193195.199:797): pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:15.199000 audit[5135]: CRED_DISP pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:15.210193 systemd-logind[1659]: Session 16 logged out. Waiting for processes to exit. Jan 23 18:33:15.210314 kernel: audit: type=1104 audit(1769193195.199:798): pid=5135 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:15.211904 systemd-logind[1659]: Removed session 16. Jan 23 18:33:15.202000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.9.101:22-68.220.241.50:49796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:16.378266 kubelet[2894]: E0123 18:33:16.378020 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:33:16.378266 kubelet[2894]: E0123 18:33:16.378020 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:33:20.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.9.101:22-68.220.241.50:49810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:20.306516 systemd[1]: Started sshd@15-10.0.9.101:22-68.220.241.50:49810.service - OpenSSH per-connection server daemon (68.220.241.50:49810). Jan 23 18:33:20.307639 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:20.307675 kernel: audit: type=1130 audit(1769193200.305:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.9.101:22-68.220.241.50:49810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:20.834000 audit[5153]: USER_ACCT pid=5153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.835825 sshd[5153]: Accepted publickey for core from 68.220.241.50 port 49810 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:20.839634 sshd-session[5153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:20.840289 kernel: audit: type=1101 audit(1769193200.834:801): pid=5153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.836000 audit[5153]: CRED_ACQ pid=5153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.844286 kernel: audit: type=1103 audit(1769193200.836:802): pid=5153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.836000 audit[5153]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8da80ad0 a2=3 a3=0 items=0 ppid=1 pid=5153 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:20.848900 kernel: audit: type=1006 audit(1769193200.836:803): pid=5153 uid=0 
subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 23 18:33:20.848932 kernel: audit: type=1300 audit(1769193200.836:803): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8da80ad0 a2=3 a3=0 items=0 ppid=1 pid=5153 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:20.836000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:20.853278 kernel: audit: type=1327 audit(1769193200.836:803): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:20.852949 systemd-logind[1659]: New session 17 of user core. Jan 23 18:33:20.865491 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 23 18:33:20.867000 audit[5153]: USER_START pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.870000 audit[5157]: CRED_ACQ pid=5157 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.874553 kernel: audit: type=1105 audit(1769193200.867:804): pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:20.874616 kernel: audit: type=1103 
audit(1769193200.870:805): pid=5157 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:21.184320 sshd[5157]: Connection closed by 68.220.241.50 port 49810 Jan 23 18:33:21.185104 sshd-session[5153]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:21.184000 audit[5153]: USER_END pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:21.188630 systemd[1]: sshd@15-10.0.9.101:22-68.220.241.50:49810.service: Deactivated successfully. Jan 23 18:33:21.191605 kernel: audit: type=1106 audit(1769193201.184:806): pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:21.190587 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 23 18:33:21.195314 kernel: audit: type=1104 audit(1769193201.184:807): pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:21.184000 audit[5153]: CRED_DISP pid=5153 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:21.195544 systemd-logind[1659]: Session 17 logged out. Waiting for processes to exit. Jan 23 18:33:21.187000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.9.101:22-68.220.241.50:49810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:21.197089 systemd-logind[1659]: Removed session 17. 
Jan 23 18:33:22.379493 kubelet[2894]: E0123 18:33:22.379441 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:33:23.378066 kubelet[2894]: E0123 18:33:23.378030 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:33:25.379679 kubelet[2894]: E0123 18:33:25.379628 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: 
not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:33:26.286254 systemd[1]: Started sshd@16-10.0.9.101:22-68.220.241.50:57640.service - OpenSSH per-connection server daemon (68.220.241.50:57640). Jan 23 18:33:26.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.9.101:22-68.220.241.50:57640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:26.290031 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:26.290090 kernel: audit: type=1130 audit(1769193206.285:809): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.9.101:22-68.220.241.50:57640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:26.811000 audit[5194]: USER_ACCT pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.815416 sshd[5194]: Accepted publickey for core from 68.220.241.50 port 57640 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:26.818310 kernel: audit: type=1101 audit(1769193206.811:810): pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.817000 audit[5194]: CRED_ACQ pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.821859 sshd-session[5194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:26.824417 kernel: audit: type=1103 audit(1769193206.817:811): pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.825628 kernel: audit: type=1006 audit(1769193206.817:812): pid=5194 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 23 18:33:26.817000 audit[5194]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc85590ea0 a2=3 a3=0 items=0 ppid=1 pid=5194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:26.828158 kernel: audit: type=1300 audit(1769193206.817:812): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc85590ea0 a2=3 a3=0 items=0 ppid=1 pid=5194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:26.817000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:26.831534 kernel: audit: type=1327 audit(1769193206.817:812): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:26.838735 systemd-logind[1659]: New session 18 of user core. Jan 23 18:33:26.844560 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 23 18:33:26.846000 audit[5194]: USER_START pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.852299 kernel: audit: type=1105 audit(1769193206.846:813): pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.848000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:26.859279 kernel: audit: type=1103 audit(1769193206.848:814): 
pid=5198 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.214293 sshd[5198]: Connection closed by 68.220.241.50 port 57640 Jan 23 18:33:27.214452 sshd-session[5194]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:27.223299 kernel: audit: type=1106 audit(1769193207.217:815): pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.217000 audit[5194]: USER_END pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.221843 systemd[1]: sshd@16-10.0.9.101:22-68.220.241.50:57640.service: Deactivated successfully. Jan 23 18:33:27.223907 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 18:33:27.217000 audit[5194]: CRED_DISP pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.228571 systemd-logind[1659]: Session 18 logged out. Waiting for processes to exit. Jan 23 18:33:27.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.9.101:22-68.220.241.50:57640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 23 18:33:27.229317 kernel: audit: type=1104 audit(1769193207.217:816): pid=5194 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.229538 systemd-logind[1659]: Removed session 18. Jan 23 18:33:27.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.9.101:22-68.220.241.50:57646 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:27.322533 systemd[1]: Started sshd@17-10.0.9.101:22-68.220.241.50:57646.service - OpenSSH per-connection server daemon (68.220.241.50:57646). Jan 23 18:33:27.377699 kubelet[2894]: E0123 18:33:27.377668 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:33:27.379161 kubelet[2894]: E0123 18:33:27.379104 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" 
podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71" Jan 23 18:33:27.854000 audit[5209]: USER_ACCT pid=5209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.856335 sshd[5209]: Accepted publickey for core from 68.220.241.50 port 57646 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:27.855000 audit[5209]: CRED_ACQ pid=5209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.855000 audit[5209]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcc5d32f50 a2=3 a3=0 items=0 ppid=1 pid=5209 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:27.855000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:27.857900 sshd-session[5209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:27.862449 systemd-logind[1659]: New session 19 of user core. Jan 23 18:33:27.868423 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 23 18:33:27.869000 audit[5209]: USER_START pid=5209 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:27.872000 audit[5213]: CRED_ACQ pid=5213 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:28.379146 kubelet[2894]: E0123 18:33:28.379073 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a" Jan 23 18:33:28.456376 sshd[5213]: Connection closed by 68.220.241.50 port 57646 Jan 23 18:33:28.458417 sshd-session[5209]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:28.458000 audit[5209]: USER_END pid=5209 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:28.460000 audit[5209]: CRED_DISP pid=5209 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:28.462000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.9.101:22-68.220.241.50:57646 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:28.463782 systemd[1]: sshd@17-10.0.9.101:22-68.220.241.50:57646.service: Deactivated successfully. Jan 23 18:33:28.467232 systemd[1]: session-19.scope: Deactivated successfully. Jan 23 18:33:28.469566 systemd-logind[1659]: Session 19 logged out. Waiting for processes to exit. Jan 23 18:33:28.471904 systemd-logind[1659]: Removed session 19. Jan 23 18:33:28.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.9.101:22-68.220.241.50:57662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:28.564055 systemd[1]: Started sshd@18-10.0.9.101:22-68.220.241.50:57662.service - OpenSSH per-connection server daemon (68.220.241.50:57662). 
Jan 23 18:33:29.086000 audit[5223]: USER_ACCT pid=5223 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.088490 sshd[5223]: Accepted publickey for core from 68.220.241.50 port 57662 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:29.088000 audit[5223]: CRED_ACQ pid=5223 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.088000 audit[5223]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeda5a2250 a2=3 a3=0 items=0 ppid=1 pid=5223 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:29.088000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:29.090083 sshd-session[5223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:29.095294 systemd-logind[1659]: New session 20 of user core. Jan 23 18:33:29.100395 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 23 18:33:29.101000 audit[5223]: USER_START pid=5223 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.103000 audit[5227]: CRED_ACQ pid=5227 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.864327 sshd[5227]: Connection closed by 68.220.241.50 port 57662 Jan 23 18:33:29.864893 sshd-session[5223]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:29.867000 audit[5223]: USER_END pid=5223 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.867000 audit[5223]: CRED_DISP pid=5223 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:29.871619 systemd[1]: sshd@18-10.0.9.101:22-68.220.241.50:57662.service: Deactivated successfully. Jan 23 18:33:29.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.9.101:22-68.220.241.50:57662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:29.875036 systemd[1]: session-20.scope: Deactivated successfully. 
Jan 23 18:33:29.876968 systemd-logind[1659]: Session 20 logged out. Waiting for processes to exit. Jan 23 18:33:29.879144 systemd-logind[1659]: Removed session 20. Jan 23 18:33:29.881000 audit[5245]: NETFILTER_CFG table=filter:141 family=2 entries=26 op=nft_register_rule pid=5245 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:29.881000 audit[5245]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffd501a4920 a2=0 a3=7ffd501a490c items=0 ppid=3032 pid=5245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:29.881000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:29.885000 audit[5245]: NETFILTER_CFG table=nat:142 family=2 entries=20 op=nft_register_rule pid=5245 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:29.885000 audit[5245]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd501a4920 a2=0 a3=0 items=0 ppid=3032 pid=5245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:29.885000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:29.901000 audit[5247]: NETFILTER_CFG table=filter:143 family=2 entries=38 op=nft_register_rule pid=5247 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:29.901000 audit[5247]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff922823c0 a2=0 a3=7fff922823ac items=0 ppid=3032 pid=5247 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:29.901000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:29.908000 audit[5247]: NETFILTER_CFG table=nat:144 family=2 entries=20 op=nft_register_rule pid=5247 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:29.908000 audit[5247]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff922823c0 a2=0 a3=0 items=0 ppid=3032 pid=5247 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:29.908000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:29.974327 systemd[1]: Started sshd@19-10.0.9.101:22-68.220.241.50:57668.service - OpenSSH per-connection server daemon (68.220.241.50:57668). Jan 23 18:33:29.973000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.9.101:22-68.220.241.50:57668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:30.492000 audit[5249]: USER_ACCT pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.494216 sshd[5249]: Accepted publickey for core from 68.220.241.50 port 57668 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:30.493000 audit[5249]: CRED_ACQ pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.493000 audit[5249]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd1b36a310 a2=3 a3=0 items=0 ppid=1 pid=5249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:30.493000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:30.495912 sshd-session[5249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:30.500108 systemd-logind[1659]: New session 21 of user core. Jan 23 18:33:30.504406 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 23 18:33:30.506000 audit[5249]: USER_START pid=5249 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.507000 audit[5253]: CRED_ACQ pid=5253 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.944018 sshd[5253]: Connection closed by 68.220.241.50 port 57668 Jan 23 18:33:30.944513 sshd-session[5249]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:30.946000 audit[5249]: USER_END pid=5249 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.946000 audit[5249]: CRED_DISP pid=5249 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:30.951058 systemd-logind[1659]: Session 21 logged out. Waiting for processes to exit. Jan 23 18:33:30.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.9.101:22-68.220.241.50:57668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:30.951794 systemd[1]: sshd@19-10.0.9.101:22-68.220.241.50:57668.service: Deactivated successfully. 
Jan 23 18:33:30.955974 systemd[1]: session-21.scope: Deactivated successfully. Jan 23 18:33:30.958759 systemd-logind[1659]: Removed session 21. Jan 23 18:33:31.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.9.101:22-68.220.241.50:57670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:31.054390 systemd[1]: Started sshd@20-10.0.9.101:22-68.220.241.50:57670.service - OpenSSH per-connection server daemon (68.220.241.50:57670). Jan 23 18:33:31.583000 audit[5263]: USER_ACCT pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.585132 sshd[5263]: Accepted publickey for core from 68.220.241.50 port 57670 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:31.585874 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 23 18:33:31.587207 kernel: audit: type=1101 audit(1769193211.583:850): pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.589801 sshd-session[5263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:31.586000 audit[5263]: CRED_ACQ pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.592082 kernel: audit: type=1103 audit(1769193211.586:851): pid=5263 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.595401 kernel: audit: type=1006 audit(1769193211.586:852): pid=5263 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 23 18:33:31.586000 audit[5263]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd4a3ed440 a2=3 a3=0 items=0 ppid=1 pid=5263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:31.586000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:31.602375 systemd-logind[1659]: New session 22 of user core. Jan 23 18:33:31.604347 kernel: audit: type=1300 audit(1769193211.586:852): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd4a3ed440 a2=3 a3=0 items=0 ppid=1 pid=5263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:31.604403 kernel: audit: type=1327 audit(1769193211.586:852): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:31.606436 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 23 18:33:31.614295 kernel: audit: type=1105 audit(1769193211.608:853): pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.608000 audit[5263]: USER_START pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.613000 audit[5267]: CRED_ACQ pid=5267 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.618285 kernel: audit: type=1103 audit(1769193211.613:854): pid=5267 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.942019 sshd[5267]: Connection closed by 68.220.241.50 port 57670 Jan 23 18:33:31.942783 sshd-session[5263]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:31.944000 audit[5263]: USER_END pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.948728 systemd[1]: 
sshd@20-10.0.9.101:22-68.220.241.50:57670.service: Deactivated successfully. Jan 23 18:33:31.950996 systemd[1]: session-22.scope: Deactivated successfully. Jan 23 18:33:31.951675 kernel: audit: type=1106 audit(1769193211.944:855): pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.944000 audit[5263]: CRED_DISP pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.955942 systemd-logind[1659]: Session 22 logged out. Waiting for processes to exit. Jan 23 18:33:31.948000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.9.101:22-68.220.241.50:57670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:31.957526 kernel: audit: type=1104 audit(1769193211.944:856): pid=5263 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:31.957583 kernel: audit: type=1131 audit(1769193211.948:857): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.9.101:22-68.220.241.50:57670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:31.958298 systemd-logind[1659]: Removed session 22. 
Jan 23 18:33:34.666000 audit[5278]: NETFILTER_CFG table=filter:145 family=2 entries=26 op=nft_register_rule pid=5278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:34.666000 audit[5278]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff9fbe1ab0 a2=0 a3=7fff9fbe1a9c items=0 ppid=3032 pid=5278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:34.666000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:34.672000 audit[5278]: NETFILTER_CFG table=nat:146 family=2 entries=104 op=nft_register_chain pid=5278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:34.672000 audit[5278]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7fff9fbe1ab0 a2=0 a3=7fff9fbe1a9c items=0 ppid=3032 pid=5278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:34.672000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:36.377769 kubelet[2894]: E0123 18:33:36.377736 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:33:36.379192 
kubelet[2894]: E0123 18:33:36.379159 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69" Jan 23 18:33:37.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.9.101:22-68.220.241.50:44222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:37.054061 systemd[1]: Started sshd@21-10.0.9.101:22-68.220.241.50:44222.service - OpenSSH per-connection server daemon (68.220.241.50:44222). Jan 23 18:33:37.055619 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 23 18:33:37.055650 kernel: audit: type=1130 audit(1769193217.053:860): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.9.101:22-68.220.241.50:44222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:37.378439 kubelet[2894]: E0123 18:33:37.378359 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8" Jan 23 18:33:37.573000 audit[5280]: USER_ACCT pid=5280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:37.578123 sshd[5280]: Accepted publickey for core from 68.220.241.50 port 44222 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw Jan 23 18:33:37.579363 kernel: audit: type=1101 audit(1769193217.573:861): pid=5280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:37.579457 sshd-session[5280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:37.575000 audit[5280]: CRED_ACQ pid=5280 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:37.584334 kernel: audit: type=1103 audit(1769193217.575:862): pid=5280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 18:33:37.591475 kernel: audit: type=1006 audit(1769193217.575:863): pid=5280 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 23 18:33:37.575000 audit[5280]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1e2a17d0 a2=3 a3=0 items=0 ppid=1 pid=5280 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:37.575000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:37.599392 systemd-logind[1659]: New session 23 of user core. Jan 23 18:33:37.600939 kernel: audit: type=1300 audit(1769193217.575:863): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc1e2a17d0 a2=3 a3=0 items=0 ppid=1 pid=5280 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:37.601007 kernel: audit: type=1327 audit(1769193217.575:863): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:37.601449 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 23 18:33:37.610338 kernel: audit: type=1105 audit(1769193217.603:864): pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.603000 audit[5280]: USER_START pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.610000 audit[5286]: CRED_ACQ pid=5286 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.615287 kernel: audit: type=1103 audit(1769193217.610:865): pid=5286 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.947820 sshd[5286]: Connection closed by 68.220.241.50 port 44222
Jan 23 18:33:37.949835 sshd-session[5280]: pam_unix(sshd:session): session closed for user core
Jan 23 18:33:37.950000 audit[5280]: USER_END pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.952000 audit[5280]: CRED_DISP pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.958150 kernel: audit: type=1106 audit(1769193217.950:866): pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.958183 kernel: audit: type=1104 audit(1769193217.952:867): pid=5280 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:37.959555 systemd[1]: sshd@21-10.0.9.101:22-68.220.241.50:44222.service: Deactivated successfully.
Jan 23 18:33:37.957000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.9.101:22-68.220.241.50:44222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:37.962397 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 18:33:37.963950 systemd-logind[1659]: Session 23 logged out. Waiting for processes to exit.
Jan 23 18:33:37.966468 systemd-logind[1659]: Removed session 23.
Jan 23 18:33:40.376852 kubelet[2894]: E0123 18:33:40.376802 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71"
Jan 23 18:33:40.378063 kubelet[2894]: E0123 18:33:40.378036 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12"
Jan 23 18:33:42.377272 kubelet[2894]: E0123 18:33:42.377110 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a"
Jan 23 18:33:43.059000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.9.101:22-68.220.241.50:47626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:43.059435 systemd[1]: Started sshd@22-10.0.9.101:22-68.220.241.50:47626.service - OpenSSH per-connection server daemon (68.220.241.50:47626).
Jan 23 18:33:43.060850 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 18:33:43.060886 kernel: audit: type=1130 audit(1769193223.059:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.9.101:22-68.220.241.50:47626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:43.574000 audit[5298]: USER_ACCT pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.575132 sshd[5298]: Accepted publickey for core from 68.220.241.50 port 47626 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:33:43.578000 audit[5298]: CRED_ACQ pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.579345 sshd-session[5298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:33:43.580121 kernel: audit: type=1101 audit(1769193223.574:870): pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.580169 kernel: audit: type=1103 audit(1769193223.578:871): pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.578000 audit[5298]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcd4e9b0a0 a2=3 a3=0 items=0 ppid=1 pid=5298 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:43.587029 kernel: audit: type=1006 audit(1769193223.578:872): pid=5298 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1
Jan 23 18:33:43.587082 kernel: audit: type=1300 audit(1769193223.578:872): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcd4e9b0a0 a2=3 a3=0 items=0 ppid=1 pid=5298 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:43.588080 systemd-logind[1659]: New session 24 of user core.
Jan 23 18:33:43.578000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:43.591003 kernel: audit: type=1327 audit(1769193223.578:872): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:43.598447 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 23 18:33:43.601000 audit[5298]: USER_START pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.607326 kernel: audit: type=1105 audit(1769193223.601:873): pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.607000 audit[5302]: CRED_ACQ pid=5302 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.611279 kernel: audit: type=1103 audit(1769193223.607:874): pid=5302 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.968275 sshd[5302]: Connection closed by 68.220.241.50 port 47626
Jan 23 18:33:43.968728 sshd-session[5298]: pam_unix(sshd:session): session closed for user core
Jan 23 18:33:43.969000 audit[5298]: USER_END pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.972253 systemd[1]: sshd@22-10.0.9.101:22-68.220.241.50:47626.service: Deactivated successfully.
Jan 23 18:33:43.974351 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 18:33:43.975288 kernel: audit: type=1106 audit(1769193223.969:875): pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.975189 systemd-logind[1659]: Session 24 logged out. Waiting for processes to exit.
Jan 23 18:33:43.977647 systemd-logind[1659]: Removed session 24.
Jan 23 18:33:43.969000 audit[5298]: CRED_DISP pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.982277 kernel: audit: type=1104 audit(1769193223.969:876): pid=5298 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:43.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.9.101:22-68.220.241.50:47626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:49.073611 systemd[1]: Started sshd@23-10.0.9.101:22-68.220.241.50:47642.service - OpenSSH per-connection server daemon (68.220.241.50:47642).
Jan 23 18:33:49.075876 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 18:33:49.075917 kernel: audit: type=1130 audit(1769193229.073:878): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.9.101:22-68.220.241.50:47642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:49.073000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.9.101:22-68.220.241.50:47642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:49.378662 kubelet[2894]: E0123 18:33:49.378133 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243"
Jan 23 18:33:49.380478 kubelet[2894]: E0123 18:33:49.380416 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69"
Jan 23 18:33:49.594000 audit[5316]: USER_ACCT pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.595429 sshd[5316]: Accepted publickey for core from 68.220.241.50 port 47642 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:33:49.597255 sshd-session[5316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:33:49.594000 audit[5316]: CRED_ACQ pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.601016 kernel: audit: type=1101 audit(1769193229.594:879): pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.601066 kernel: audit: type=1103 audit(1769193229.594:880): pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.594000 audit[5316]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc17ac4170 a2=3 a3=0 items=0 ppid=1 pid=5316 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:49.608016 kernel: audit: type=1006 audit(1769193229.594:881): pid=5316 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1
Jan 23 18:33:49.608062 kernel: audit: type=1300 audit(1769193229.594:881): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc17ac4170 a2=3 a3=0 items=0 ppid=1 pid=5316 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:49.594000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:49.611748 kernel: audit: type=1327 audit(1769193229.594:881): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:49.613301 systemd-logind[1659]: New session 25 of user core.
Jan 23 18:33:49.618430 systemd[1]: Started session-25.scope - Session 25 of User core.
Jan 23 18:33:49.621000 audit[5316]: USER_START pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.623000 audit[5320]: CRED_ACQ pid=5320 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.626407 kernel: audit: type=1105 audit(1769193229.621:882): pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.626447 kernel: audit: type=1103 audit(1769193229.623:883): pid=5320 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.949297 sshd[5320]: Connection closed by 68.220.241.50 port 47642
Jan 23 18:33:49.950300 sshd-session[5316]: pam_unix(sshd:session): session closed for user core
Jan 23 18:33:49.950000 audit[5316]: USER_END pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.953694 systemd[1]: sshd@23-10.0.9.101:22-68.220.241.50:47642.service: Deactivated successfully.
Jan 23 18:33:49.955703 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 18:33:49.959015 kernel: audit: type=1106 audit(1769193229.950:884): pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.959076 kernel: audit: type=1104 audit(1769193229.951:885): pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.951000 audit[5316]: CRED_DISP pid=5316 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:49.957064 systemd-logind[1659]: Session 25 logged out. Waiting for processes to exit.
Jan 23 18:33:49.958381 systemd-logind[1659]: Removed session 25.
Jan 23 18:33:49.953000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.9.101:22-68.220.241.50:47642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:50.379445 kubelet[2894]: E0123 18:33:50.379405 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8"
Jan 23 18:33:51.379325 kubelet[2894]: E0123 18:33:51.379053 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12"
Jan 23 18:33:51.380354 kubelet[2894]: E0123 18:33:51.380329 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71"
Jan 23 18:33:53.378241 kubelet[2894]: E0123 18:33:53.378199 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a"
Jan 23 18:33:55.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.9.101:22-68.220.241.50:40484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:55.054548 systemd[1]: Started sshd@24-10.0.9.101:22-68.220.241.50:40484.service - OpenSSH per-connection server daemon (68.220.241.50:40484).
Jan 23 18:33:55.055701 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 18:33:55.055748 kernel: audit: type=1130 audit(1769193235.054:887): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.9.101:22-68.220.241.50:40484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:55.577000 audit[5354]: USER_ACCT pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.578210 sshd[5354]: Accepted publickey for core from 68.220.241.50 port 40484 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:33:55.579513 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:33:55.578000 audit[5354]: CRED_ACQ pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.583468 kernel: audit: type=1101 audit(1769193235.577:888): pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.583520 kernel: audit: type=1103 audit(1769193235.578:889): pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.587391 kernel: audit: type=1006 audit(1769193235.578:890): pid=5354 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1
Jan 23 18:33:55.578000 audit[5354]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7d31e170 a2=3 a3=0 items=0 ppid=1 pid=5354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:55.590509 kernel: audit: type=1300 audit(1769193235.578:890): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7d31e170 a2=3 a3=0 items=0 ppid=1 pid=5354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:33:55.578000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:55.593199 systemd-logind[1659]: New session 26 of user core.
Jan 23 18:33:55.594366 kernel: audit: type=1327 audit(1769193235.578:890): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:33:55.598806 systemd[1]: Started session-26.scope - Session 26 of User core.
Jan 23 18:33:55.603000 audit[5354]: USER_START pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.607000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.610945 kernel: audit: type=1105 audit(1769193235.603:891): pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.611008 kernel: audit: type=1103 audit(1769193235.607:892): pid=5358 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.948669 sshd[5358]: Connection closed by 68.220.241.50 port 40484
Jan 23 18:33:55.949445 sshd-session[5354]: pam_unix(sshd:session): session closed for user core
Jan 23 18:33:55.951000 audit[5354]: USER_END pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.954576 systemd[1]: sshd@24-10.0.9.101:22-68.220.241.50:40484.service: Deactivated successfully.
Jan 23 18:33:55.954891 systemd-logind[1659]: Session 26 logged out. Waiting for processes to exit.
Jan 23 18:33:55.951000 audit[5354]: CRED_DISP pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.957746 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 18:33:55.957996 kernel: audit: type=1106 audit(1769193235.951:893): pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.958033 kernel: audit: type=1104 audit(1769193235.951:894): pid=5354 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:33:55.954000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.9.101:22-68.220.241.50:40484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:33:55.960756 systemd-logind[1659]: Removed session 26.
Jan 23 18:34:01.055280 systemd[1]: Started sshd@25-10.0.9.101:22-68.220.241.50:40494.service - OpenSSH per-connection server daemon (68.220.241.50:40494).
Jan 23 18:34:01.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.9.101:22-68.220.241.50:40494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:34:01.057466 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 18:34:01.057513 kernel: audit: type=1130 audit(1769193241.055:896): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.9.101:22-68.220.241.50:40494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:34:01.575000 audit[5370]: USER_ACCT pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.581311 kernel: audit: type=1101 audit(1769193241.575:897): pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.581404 sshd[5370]: Accepted publickey for core from 68.220.241.50 port 40494 ssh2: RSA SHA256:nKFqiC1tfI89CurykR2N2Qujx/39ZzIdQ7HqDt8w/Gw
Jan 23 18:34:01.584706 sshd-session[5370]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 18:34:01.581000 audit[5370]: CRED_ACQ pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.589338 kernel: audit: type=1103 audit(1769193241.581:898): pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.598341 kernel: audit: type=1006 audit(1769193241.581:899): pid=5370 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1
Jan 23 18:34:01.599072 systemd-logind[1659]: New session 27 of user core.
Jan 23 18:34:01.601435 systemd[1]: Started session-27.scope - Session 27 of User core.
Jan 23 18:34:01.607325 kernel: audit: type=1300 audit(1769193241.581:899): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc705265c0 a2=3 a3=0 items=0 ppid=1 pid=5370 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:01.581000 audit[5370]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc705265c0 a2=3 a3=0 items=0 ppid=1 pid=5370 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:01.613545 kernel: audit: type=1327 audit(1769193241.581:899): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:34:01.581000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 18:34:01.612000 audit[5370]: USER_START pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.620720 kernel: audit: type=1105 audit(1769193241.612:900): pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.614000 audit[5374]: CRED_ACQ pid=5374 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.627274 kernel: audit: type=1103 audit(1769193241.614:901): pid=5374 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.953003 sshd[5374]: Connection closed by 68.220.241.50 port 40494
Jan 23 18:34:01.953551 sshd-session[5370]: pam_unix(sshd:session): session closed for user core
Jan 23 18:34:01.955000 audit[5370]: USER_END pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.958000 audit[5370]: CRED_DISP pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.961992 kernel: audit: type=1106 audit(1769193241.955:902): pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.962054 kernel: audit: type=1104 audit(1769193241.958:903): pid=5370 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 23 18:34:01.964301 systemd[1]: sshd@25-10.0.9.101:22-68.220.241.50:40494.service: Deactivated successfully.
Jan 23 18:34:01.965916 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 18:34:01.964000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.9.101:22-68.220.241.50:40494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 18:34:01.970766 systemd-logind[1659]: Session 27 logged out. Waiting for processes to exit.
Jan 23 18:34:01.971798 systemd-logind[1659]: Removed session 27.
Jan 23 18:34:02.377420 kubelet[2894]: E0123 18:34:02.377379 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243"
Jan 23 18:34:02.378182 kubelet[2894]: E0123 18:34:02.377675 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71"
Jan 23 18:34:02.378888 kubelet[2894]: E0123 18:34:02.378843 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69"
Jan 23 18:34:04.377951 kubelet[2894]: E0123 18:34:04.377556 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a"
Jan 23 18:34:05.379477 kubelet[2894]: E0123 18:34:05.379444 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8"
Jan 23 18:34:06.377903 kubelet[2894]: E0123 18:34:06.377592 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12"
Jan 23 18:34:13.379210 kubelet[2894]: E0123 18:34:13.379153 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243"
Jan 23 18:34:15.380628 containerd[1690]: time="2026-01-23T18:34:15.380363452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 23 18:34:15.381546 kubelet[2894]: E0123 18:34:15.381274 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71"
Jan 23 18:34:15.703786 containerd[1690]: time="2026-01-23T18:34:15.703542437Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:15.704998 containerd[1690]: time="2026-01-23T18:34:15.704941433Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 23 18:34:15.705140 containerd[1690]: time="2026-01-23T18:34:15.705020660Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:15.705178 kubelet[2894]: E0123 18:34:15.705134 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 18:34:15.705178 kubelet[2894]: E0123 18:34:15.705170 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 18:34:15.705320 kubelet[2894]: E0123 18:34:15.705282 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:9dc376b3bb22419f9d12e1d73f9667ce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:15.707211 containerd[1690]: time="2026-01-23T18:34:15.707188089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Jan 23 18:34:16.042172 containerd[1690]: time="2026-01-23T18:34:16.042057066Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:16.044169 containerd[1690]: time="2026-01-23T18:34:16.044096597Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Jan 23 18:34:16.044240 containerd[1690]: time="2026-01-23T18:34:16.044161063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:16.044481 kubelet[2894]: E0123 18:34:16.044408 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 18:34:16.044481 kubelet[2894]: E0123 18:34:16.044466 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 18:34:16.044902 kubelet[2894]: E0123 18:34:16.044675 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvjpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-775b766696-sj9gw_calico-system(a7bedb6c-04ad-4dfc-97e0-53467bd29e69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:16.046139 kubelet[2894]: E0123 18:34:16.046093 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69"
Jan 23 18:34:16.377726 containerd[1690]: time="2026-01-23T18:34:16.377666203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 23 18:34:16.735227 containerd[1690]: time="2026-01-23T18:34:16.735047514Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:16.737095 containerd[1690]: time="2026-01-23T18:34:16.736994605Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 23 18:34:16.737095 containerd[1690]: time="2026-01-23T18:34:16.737068521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:16.737326 kubelet[2894]: E0123 18:34:16.737298 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 18:34:16.737701 kubelet[2894]: E0123 18:34:16.737571 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 18:34:16.737782 kubelet[2894]: E0123 18:34:16.737750 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-pbrfh_calico-apiserver(7a5d89a2-c80e-4f56-8808-99252854603a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:16.738890 kubelet[2894]: E0123 18:34:16.738865 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a"
Jan 23 18:34:20.377299 containerd[1690]: time="2026-01-23T18:34:20.377243689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Jan 23 18:34:20.735275 containerd[1690]: time="2026-01-23T18:34:20.735167601Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:20.736648 containerd[1690]: time="2026-01-23T18:34:20.736621893Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Jan 23 18:34:20.736696 containerd[1690]: time="2026-01-23T18:34:20.736687644Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:20.736835 kubelet[2894]: E0123 18:34:20.736811 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 23 18:34:20.737229 kubelet[2894]: E0123 18:34:20.737094 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 23 18:34:20.737229 kubelet[2894]: E0123 18:34:20.737196 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:20.738888 containerd[1690]: time="2026-01-23T18:34:20.738849189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Jan 23 18:34:21.073060 containerd[1690]: time="2026-01-23T18:34:21.072934521Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:21.074167 containerd[1690]: time="2026-01-23T18:34:21.074074894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Jan 23 18:34:21.074167 containerd[1690]: time="2026-01-23T18:34:21.074148467Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:21.074317 kubelet[2894]: E0123 18:34:21.074284 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 23 18:34:21.074358 kubelet[2894]: E0123 18:34:21.074326 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 23 18:34:21.074446 kubelet[2894]: E0123 18:34:21.074417 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cprcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j9dvq_calico-system(e681e1b7-9935-4d75-8509-9acd7616e3d8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:21.075562 kubelet[2894]: E0123 18:34:21.075534 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8"
Jan 23 18:34:21.378676 containerd[1690]: time="2026-01-23T18:34:21.378434129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Jan 23 18:34:21.711373 containerd[1690]: time="2026-01-23T18:34:21.711248435Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 18:34:21.712440 containerd[1690]: time="2026-01-23T18:34:21.712412250Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Jan 23 18:34:21.712543 containerd[1690]: time="2026-01-23T18:34:21.712416388Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:21.712612 kubelet[2894]: E0123 18:34:21.712585 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 18:34:21.712672 kubelet[2894]: E0123 18:34:21.712622 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 18:34:21.712770 kubelet[2894]: E0123 18:34:21.712733 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzml4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Probe
Handler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9xqc6_calico-system(dad22ee7-a9d6-4858-9e53-0db48fecba12): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:34:21.713894 kubelet[2894]: E0123 18:34:21.713865 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12" Jan 23 18:34:26.823299 kubelet[2894]: E0123 18:34:26.823136 
2894 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.9.101:58652->10.0.9.65:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-779b7ffd49-pbrfh.188d6fbd9255136b calico-apiserver 1303 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-779b7ffd49-pbrfh,UID:7a5d89a2-c80e-4f56-8808-99252854603a,APIVersion:v1,ResourceVersion:798,FieldPath:spec.containers{calico-apiserver},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4547-1-0-2-32611d5cc2,},FirstTimestamp:2026-01-23 18:31:28 +0000 UTC,LastTimestamp:2026-01-23 18:34:16.377302592 +0000 UTC m=+219.084638137,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-1-0-2-32611d5cc2,}" Jan 23 18:34:27.328010 systemd[1]: cri-containerd-1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0.scope: Deactivated successfully. Jan 23 18:34:27.328801 systemd[1]: cri-containerd-1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0.scope: Consumed 23.722s CPU time, 118.4M memory peak, 680K read from disk. 
Jan 23 18:34:27.330067 containerd[1690]: time="2026-01-23T18:34:27.330041083Z" level=info msg="received container exit event container_id:\"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\" id:\"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\" pid:3229 exit_status:1 exited_at:{seconds:1769193267 nanos:329360994}" Jan 23 18:34:27.332000 audit: BPF prog-id=146 op=UNLOAD Jan 23 18:34:27.334276 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:34:27.334336 kernel: audit: type=1334 audit(1769193267.332:905): prog-id=146 op=UNLOAD Jan 23 18:34:27.332000 audit: BPF prog-id=150 op=UNLOAD Jan 23 18:34:27.336913 kernel: audit: type=1334 audit(1769193267.332:906): prog-id=150 op=UNLOAD Jan 23 18:34:27.352613 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0-rootfs.mount: Deactivated successfully. Jan 23 18:34:27.378368 containerd[1690]: time="2026-01-23T18:34:27.378342474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:34:27.722212 containerd[1690]: time="2026-01-23T18:34:27.722062116Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:34:27.723511 containerd[1690]: time="2026-01-23T18:34:27.723452867Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:34:27.723762 containerd[1690]: time="2026-01-23T18:34:27.723488825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:34:27.723814 kubelet[2894]: E0123 18:34:27.723774 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:34:27.723932 kubelet[2894]: E0123 18:34:27.723818 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:34:27.723958 kubelet[2894]: E0123 18:34:27.723932 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-779b7ffd49-78hkk_calico-apiserver(644e9aff-fbfa-4d8d-bb83-94a0bb426243): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:34:27.725394 kubelet[2894]: E0123 18:34:27.725374 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243" Jan 23 18:34:27.783778 kubelet[2894]: E0123 18:34:27.783744 2894 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.9.101:58864->10.0.9.65:2379: read: connection timed out" Jan 23 18:34:28.183082 kubelet[2894]: I0123 18:34:28.183060 2894 
scope.go:117] "RemoveContainer" containerID="1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0" Jan 23 18:34:28.185206 containerd[1690]: time="2026-01-23T18:34:28.184978723Z" level=info msg="CreateContainer within sandbox \"4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 23 18:34:28.196632 containerd[1690]: time="2026-01-23T18:34:28.196609512Z" level=info msg="Container 001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:34:28.203415 containerd[1690]: time="2026-01-23T18:34:28.203391743Z" level=info msg="CreateContainer within sandbox \"4c284b7020e8efa8eaa227d3bb2bba2873ff0ae88e46d26f23ef1e73a20f3b12\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c\"" Jan 23 18:34:28.204284 containerd[1690]: time="2026-01-23T18:34:28.203778372Z" level=info msg="StartContainer for \"001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c\"" Jan 23 18:34:28.204504 containerd[1690]: time="2026-01-23T18:34:28.204477227Z" level=info msg="connecting to shim 001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c" address="unix:///run/containerd/s/bcead1df0e8fa871db231778207e965752e21fe3e09f27e6bbb0a6a614b1dbf9" protocol=ttrpc version=3 Jan 23 18:34:28.223410 systemd[1]: Started cri-containerd-001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c.scope - libcontainer container 001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c. 
Jan 23 18:34:28.232000 audit: BPF prog-id=256 op=LOAD Jan 23 18:34:28.232000 audit: BPF prog-id=257 op=LOAD Jan 23 18:34:28.235621 kernel: audit: type=1334 audit(1769193268.232:907): prog-id=256 op=LOAD Jan 23 18:34:28.235664 kernel: audit: type=1334 audit(1769193268.232:908): prog-id=257 op=LOAD Jan 23 18:34:28.236501 kernel: audit: type=1300 audit(1769193268.232:908): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.242424 kernel: audit: type=1327 audit(1769193268.232:908): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=257 op=UNLOAD Jan 23 18:34:28.245390 kernel: audit: type=1334 audit(1769193268.232:909): prog-id=257 op=UNLOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.247836 kernel: audit: type=1300 audit(1769193268.232:909): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=258 op=LOAD Jan 23 18:34:28.255501 kernel: audit: type=1327 audit(1769193268.232:909): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.255928 kernel: audit: type=1334 audit(1769193268.232:910): prog-id=258 op=LOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=259 op=LOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=259 op=UNLOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=258 op=UNLOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.232000 audit: BPF prog-id=260 op=LOAD Jan 23 18:34:28.232000 audit[5431]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3022 pid=5431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:28.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316335663366653431646162353732643530323136613038666538 Jan 23 18:34:28.267643 containerd[1690]: time="2026-01-23T18:34:28.267598766Z" level=info msg="StartContainer for \"001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c\" returns successfully" Jan 23 18:34:28.295362 systemd[1]: cri-containerd-24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440.scope: Deactivated successfully. Jan 23 18:34:28.295638 systemd[1]: cri-containerd-24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440.scope: Consumed 3.428s CPU time, 55.4M memory peak. Jan 23 18:34:28.295000 audit: BPF prog-id=261 op=LOAD Jan 23 18:34:28.295000 audit: BPF prog-id=93 op=UNLOAD Jan 23 18:34:28.297000 audit: BPF prog-id=108 op=UNLOAD Jan 23 18:34:28.297000 audit: BPF prog-id=112 op=UNLOAD Jan 23 18:34:28.299091 containerd[1690]: time="2026-01-23T18:34:28.299067946Z" level=info msg="received container exit event container_id:\"24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440\" id:\"24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440\" pid:2751 exit_status:1 exited_at:{seconds:1769193268 nanos:295049760}" Jan 23 18:34:28.335494 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440-rootfs.mount: Deactivated successfully. 
Jan 23 18:34:29.186276 kubelet[2894]: I0123 18:34:29.185867 2894 scope.go:117] "RemoveContainer" containerID="24bb17e21eae28d5ce491e01f7f9a24ca937cd79552e2142db76a4ec55565440" Jan 23 18:34:29.188099 containerd[1690]: time="2026-01-23T18:34:29.187749404Z" level=info msg="CreateContainer within sandbox \"d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 18:34:29.200274 containerd[1690]: time="2026-01-23T18:34:29.200239835Z" level=info msg="Container bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:34:29.200929 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3996694436.mount: Deactivated successfully. Jan 23 18:34:29.210845 containerd[1690]: time="2026-01-23T18:34:29.210814986Z" level=info msg="CreateContainer within sandbox \"d390d3daf79ae4c427b44f6e322e213d343ddd3650025cbc1967d0ad3789a89f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130\"" Jan 23 18:34:29.211344 containerd[1690]: time="2026-01-23T18:34:29.211250630Z" level=info msg="StartContainer for \"bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130\"" Jan 23 18:34:29.212232 containerd[1690]: time="2026-01-23T18:34:29.212189301Z" level=info msg="connecting to shim bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130" address="unix:///run/containerd/s/670e9b1f02ae640eb538a016c40a3b185f4d90182f3951b43f5d3c7524940110" protocol=ttrpc version=3 Jan 23 18:34:29.231442 systemd[1]: Started cri-containerd-bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130.scope - libcontainer container bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130. 
Jan 23 18:34:29.241000 audit: BPF prog-id=262 op=LOAD Jan 23 18:34:29.242000 audit: BPF prog-id=263 op=LOAD Jan 23 18:34:29.242000 audit[5475]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.242000 audit: BPF prog-id=263 op=UNLOAD Jan 23 18:34:29.242000 audit[5475]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.242000 audit: BPF prog-id=264 op=LOAD Jan 23 18:34:29.242000 audit[5475]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.242000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.242000 audit: BPF prog-id=265 op=LOAD Jan 23 18:34:29.242000 audit[5475]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.243000 audit: BPF prog-id=265 op=UNLOAD Jan 23 18:34:29.243000 audit[5475]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.243000 audit: BPF prog-id=264 op=UNLOAD Jan 23 18:34:29.243000 audit[5475]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
18:34:29.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.243000 audit: BPF prog-id=266 op=LOAD Jan 23 18:34:29.243000 audit[5475]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2600 pid=5475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:29.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264343239393732343765326565336531613365656331646532313431 Jan 23 18:34:29.279202 containerd[1690]: time="2026-01-23T18:34:29.279144203Z" level=info msg="StartContainer for \"bd42997247e2ee3e1a3eec1de21415f63214e9abeb1aee51d0ca4b534a418130\" returns successfully" Jan 23 18:34:30.377761 containerd[1690]: time="2026-01-23T18:34:30.377690568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:34:30.750375 containerd[1690]: time="2026-01-23T18:34:30.750058948Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:34:30.751416 containerd[1690]: time="2026-01-23T18:34:30.751329115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:34:30.751416 containerd[1690]: time="2026-01-23T18:34:30.751395643Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0"
Jan 23 18:34:30.751755 kubelet[2894]: E0123 18:34:30.751553 2894 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 23 18:34:30.751755 kubelet[2894]: E0123 18:34:30.751588 2894 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 23 18:34:30.751755 kubelet[2894]: E0123 18:34:30.751698 2894 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8447c9595f-2lbtc_calico-system(be525a6b-06ca-4032-a777-c6e0f1c5eb71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Jan 23 18:34:30.752853 kubelet[2894]: E0123 18:34:30.752828 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8447c9595f-2lbtc" podUID="be525a6b-06ca-4032-a777-c6e0f1c5eb71"
Jan 23 18:34:30.892208 update_engine[1661]: I20260123 18:34:30.892001 1661 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Jan 23 18:34:30.892208 update_engine[1661]: I20260123 18:34:30.892045 1661 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Jan 23 18:34:30.893273 update_engine[1661]: I20260123 18:34:30.893095 1661 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Jan 23 18:34:30.894007 update_engine[1661]: I20260123 18:34:30.893986 1661 omaha_request_params.cc:62] Current group set to beta
Jan 23 18:34:30.894682 update_engine[1661]: I20260123 18:34:30.894655 1661 update_attempter.cc:499] Already updated boot flags. Skipping.
Jan 23 18:34:30.894932 update_engine[1661]: I20260123 18:34:30.894742 1661 update_attempter.cc:643] Scheduling an action processor start.
Jan 23 18:34:30.894932 update_engine[1661]: I20260123 18:34:30.894760 1661 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jan 23 18:34:30.895336 locksmithd[1705]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Jan 23 18:34:30.900686 update_engine[1661]: I20260123 18:34:30.900665 1661 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Jan 23 18:34:30.900804 update_engine[1661]: I20260123 18:34:30.900792 1661 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jan 23 18:34:30.900843 update_engine[1661]: I20260123 18:34:30.900835 1661 omaha_request_action.cc:272] Request:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.900843 update_engine[1661]:
Jan 23 18:34:30.901909 update_engine[1661]: I20260123 18:34:30.901019 1661 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 23 18:34:30.904144 update_engine[1661]: I20260123 18:34:30.904126 1661 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 23 18:34:30.904643 update_engine[1661]: I20260123 18:34:30.904623 1661 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 23 18:34:30.910696 update_engine[1661]: E20260123 18:34:30.910676 1661 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Jan 23 18:34:30.910794 update_engine[1661]: I20260123 18:34:30.910784 1661 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Jan 23 18:34:31.377755 kubelet[2894]: E0123 18:34:31.377689 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-775b766696-sj9gw" podUID="a7bedb6c-04ad-4dfc-97e0-53467bd29e69"
Jan 23 18:34:32.377218 kubelet[2894]: E0123 18:34:32.377178 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-pbrfh" podUID="7a5d89a2-c80e-4f56-8808-99252854603a"
Jan 23 18:34:32.852078 systemd[1]: cri-containerd-dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45.scope: Deactivated successfully.
Jan 23 18:34:32.852363 systemd[1]: cri-containerd-dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45.scope: Consumed 2.359s CPU time, 23.7M memory peak.
Jan 23 18:34:32.852000 audit: BPF prog-id=267 op=LOAD
Jan 23 18:34:32.854893 kernel: kauditd_printk_skb: 40 callbacks suppressed
Jan 23 18:34:32.854947 kernel: audit: type=1334 audit(1769193272.852:927): prog-id=267 op=LOAD
Jan 23 18:34:32.856382 containerd[1690]: time="2026-01-23T18:34:32.856358043Z" level=info msg="received container exit event container_id:\"dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45\" id:\"dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45\" pid:2724 exit_status:1 exited_at:{seconds:1769193272 nanos:855132719}"
Jan 23 18:34:32.852000 audit: BPF prog-id=88 op=UNLOAD
Jan 23 18:34:32.857618 kernel: audit: type=1334 audit(1769193272.852:928): prog-id=88 op=UNLOAD
Jan 23 18:34:32.858320 kernel: audit: type=1334 audit(1769193272.855:929): prog-id=98 op=UNLOAD
Jan 23 18:34:32.855000 audit: BPF prog-id=98 op=UNLOAD
Jan 23 18:34:32.855000 audit: BPF prog-id=102 op=UNLOAD
Jan 23 18:34:32.860922 kernel: audit: type=1334 audit(1769193272.855:930): prog-id=102 op=UNLOAD
Jan 23 18:34:32.877955 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45-rootfs.mount: Deactivated successfully.
Jan 23 18:34:33.198988 kubelet[2894]: I0123 18:34:33.198961 2894 scope.go:117] "RemoveContainer" containerID="dd87d330c68e8a3cc2bc49d93f89d19f818bba102203298806d51e248a22ce45"
Jan 23 18:34:33.200749 containerd[1690]: time="2026-01-23T18:34:33.200722045Z" level=info msg="CreateContainer within sandbox \"14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 23 18:34:33.210895 containerd[1690]: time="2026-01-23T18:34:33.209396495Z" level=info msg="Container 784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0: CDI devices from CRI Config.CDIDevices: []"
Jan 23 18:34:33.217562 containerd[1690]: time="2026-01-23T18:34:33.217541004Z" level=info msg="CreateContainer within sandbox \"14feddcfa7e689a7a13361fe8be75a3160bae45bf60692314a983ebdba326143\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0\""
Jan 23 18:34:33.218063 containerd[1690]: time="2026-01-23T18:34:33.218049966Z" level=info msg="StartContainer for \"784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0\""
Jan 23 18:34:33.219072 containerd[1690]: time="2026-01-23T18:34:33.219055684Z" level=info msg="connecting to shim 784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0" address="unix:///run/containerd/s/6b12dc3141138823df98416ef9a8ddbad15a648507919503ca0bfa6903a13124" protocol=ttrpc version=3
Jan 23 18:34:33.237438 systemd[1]: Started cri-containerd-784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0.scope - libcontainer container 784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0.
Jan 23 18:34:33.247000 audit: BPF prog-id=268 op=LOAD
Jan 23 18:34:33.251298 kernel: audit: type=1334 audit(1769193273.247:931): prog-id=268 op=LOAD
Jan 23 18:34:33.249000 audit: BPF prog-id=269 op=LOAD
Jan 23 18:34:33.254283 kernel: audit: type=1334 audit(1769193273.249:932): prog-id=269 op=LOAD
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.260270 kernel: audit: type=1300 audit(1769193273.249:932): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.260320 kernel: audit: type=1327 audit(1769193273.249:932): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: BPF prog-id=269 op=UNLOAD
Jan 23 18:34:33.270577 kernel: audit: type=1334 audit(1769193273.249:933): prog-id=269 op=UNLOAD
Jan 23 18:34:33.270633 kernel: audit: type=1300 audit(1769193273.249:933): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: BPF prog-id=270 op=LOAD
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: BPF prog-id=271 op=LOAD
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: BPF prog-id=271 op=UNLOAD
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.249000 audit: BPF prog-id=270 op=UNLOAD
Jan 23 18:34:33.249000 audit[5531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.249000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.250000 audit: BPF prog-id=272 op=LOAD
Jan 23 18:34:33.250000 audit[5531]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2596 pid=5531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 18:34:33.250000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738346434383839343036323263653235346138383036353735376339
Jan 23 18:34:33.296661 containerd[1690]: time="2026-01-23T18:34:33.296631013Z" level=info msg="StartContainer for \"784d488940622ce254a88065757c9d71f956a526cb1f60fc83c56ce0068604d0\" returns successfully"
Jan 23 18:34:33.378675 kubelet[2894]: E0123 18:34:33.378644 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9xqc6" podUID="dad22ee7-a9d6-4858-9e53-0db48fecba12"
Jan 23 18:34:35.377777 kubelet[2894]: E0123 18:34:35.377717 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j9dvq" podUID="e681e1b7-9935-4d75-8509-9acd7616e3d8"
Jan 23 18:34:37.785059 kubelet[2894]: E0123 18:34:37.785005 2894 controller.go:195] "Failed to update lease" err="Put \"https://10.0.9.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-2-32611d5cc2?timeout=10s\": context deadline exceeded"
Jan 23 18:34:37.959800 kubelet[2894]: I0123 18:34:37.959454 2894 status_manager.go:890] "Failed to get status for pod" podUID="b1519747-f595-415e-abc7-1bf97df28840" pod="tigera-operator/tigera-operator-7dcd859c48-vts8f" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.9.101:58770->10.0.9.65:2379: read: connection timed out"
Jan 23 18:34:38.377306 kubelet[2894]: E0123 18:34:38.377241 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-779b7ffd49-78hkk" podUID="644e9aff-fbfa-4d8d-bb83-94a0bb426243"
Jan 23 18:34:39.481216 systemd[1]: cri-containerd-001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c.scope: Deactivated successfully.
Jan 23 18:34:39.482392 containerd[1690]: time="2026-01-23T18:34:39.482177963Z" level=info msg="received container exit event container_id:\"001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c\" id:\"001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c\" pid:5444 exit_status:1 exited_at:{seconds:1769193279 nanos:481380207}"
Jan 23 18:34:39.487897 kernel: kauditd_printk_skb: 16 callbacks suppressed
Jan 23 18:34:39.487957 kernel: audit: type=1334 audit(1769193279.484:939): prog-id=256 op=UNLOAD
Jan 23 18:34:39.484000 audit: BPF prog-id=256 op=UNLOAD
Jan 23 18:34:39.484000 audit: BPF prog-id=260 op=UNLOAD
Jan 23 18:34:39.491312 kernel: audit: type=1334 audit(1769193279.484:940): prog-id=260 op=UNLOAD
Jan 23 18:34:39.503812 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c-rootfs.mount: Deactivated successfully.
Jan 23 18:34:40.214702 kubelet[2894]: I0123 18:34:40.214683 2894 scope.go:117] "RemoveContainer" containerID="1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0"
Jan 23 18:34:40.215207 kubelet[2894]: I0123 18:34:40.214843 2894 scope.go:117] "RemoveContainer" containerID="001c5f3fe41dab572d50216a08fe8398bc609310bb98b24615d02d4aec4e421c"
Jan 23 18:34:40.215207 kubelet[2894]: E0123 18:34:40.214964 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-vts8f_tigera-operator(b1519747-f595-415e-abc7-1bf97df28840)\"" pod="tigera-operator/tigera-operator-7dcd859c48-vts8f" podUID="b1519747-f595-415e-abc7-1bf97df28840"
Jan 23 18:34:40.216199 containerd[1690]: time="2026-01-23T18:34:40.216172412Z" level=info msg="RemoveContainer for \"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\""
Jan 23 18:34:40.221118 containerd[1690]: time="2026-01-23T18:34:40.221096937Z" level=info msg="RemoveContainer for \"1602eaff6cd73e1631bf9dd9f7e00d690e7452702bb3d7c599059bcadf4e9dc0\" returns successfully"
Jan 23 18:34:40.892102 update_engine[1661]: I20260123 18:34:40.891687 1661 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 23 18:34:40.892102 update_engine[1661]: I20260123 18:34:40.891772 1661 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 23 18:34:40.892102 update_engine[1661]: I20260123 18:34:40.892066 1661 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 23 18:34:40.901552 update_engine[1661]: E20260123 18:34:40.901510 1661 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Jan 23 18:34:40.901733 update_engine[1661]: I20260123 18:34:40.901710 1661 libcurl_http_fetcher.cc:283] No HTTP response, retry 2