Jan 21 00:57:19.008902 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 20 22:19:08 -00 2026 Jan 21 00:57:19.008938 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=febd26d0ecadb4f9abb44f6b2a89e793f13258cbb011a4bfe78289e5448c772a Jan 21 00:57:19.008948 kernel: BIOS-provided physical RAM map: Jan 21 00:57:19.008954 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 21 00:57:19.008960 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 21 00:57:19.008966 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 21 00:57:19.008975 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Jan 21 00:57:19.008982 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 21 00:57:19.008988 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 21 00:57:19.008994 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 21 00:57:19.009000 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable Jan 21 00:57:19.009006 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 21 00:57:19.009012 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 21 00:57:19.009018 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 21 00:57:19.009028 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 21 00:57:19.009035 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 21 00:57:19.009041 kernel: BIOS-e820: [mem 
0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 21 00:57:19.009048 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 21 00:57:19.009054 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 21 00:57:19.009061 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 21 00:57:19.009070 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 21 00:57:19.009076 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 21 00:57:19.009083 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 21 00:57:19.009089 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 21 00:57:19.009096 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 21 00:57:19.009102 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 21 00:57:19.009109 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 21 00:57:19.009116 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 21 00:57:19.009122 kernel: NX (Execute Disable) protection: active Jan 21 00:57:19.009129 kernel: APIC: Static calls initialized Jan 21 00:57:19.009136 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable Jan 21 00:57:19.009145 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable Jan 21 00:57:19.009152 kernel: extended physical RAM map: Jan 21 00:57:19.009158 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 21 00:57:19.009165 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 21 00:57:19.009172 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 21 00:57:19.009178 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Jan 21 00:57:19.009185 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 21 00:57:19.009191 
kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 21 00:57:19.009198 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 21 00:57:19.009210 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable Jan 21 00:57:19.009217 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable Jan 21 00:57:19.009224 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable Jan 21 00:57:19.009231 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable Jan 21 00:57:19.009240 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable Jan 21 00:57:19.009247 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 21 00:57:19.009254 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 21 00:57:19.009261 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 21 00:57:19.009268 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 21 00:57:19.009275 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 21 00:57:19.009282 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 21 00:57:19.009289 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 21 00:57:19.009296 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 21 00:57:19.009303 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 21 00:57:19.009310 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 21 00:57:19.009318 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 21 00:57:19.009325 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 21 00:57:19.009332 kernel: reserve setup_data: [mem 
0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 21 00:57:19.009339 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 21 00:57:19.009346 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 21 00:57:19.009353 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 21 00:57:19.009360 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 21 00:57:19.009367 kernel: efi: EFI v2.7 by EDK II Jan 21 00:57:19.009374 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018 Jan 21 00:57:19.009382 kernel: random: crng init done Jan 21 00:57:19.009389 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 21 00:57:19.009398 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 21 00:57:19.009404 kernel: secureboot: Secure boot disabled Jan 21 00:57:19.009411 kernel: SMBIOS 2.8 present. Jan 21 00:57:19.009418 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Jan 21 00:57:19.009425 kernel: DMI: Memory slots populated: 1/1 Jan 21 00:57:19.009432 kernel: Hypervisor detected: KVM Jan 21 00:57:19.009439 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 21 00:57:19.009446 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 21 00:57:19.009453 kernel: kvm-clock: using sched offset of 5647540270 cycles Jan 21 00:57:19.009462 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 21 00:57:19.009471 kernel: tsc: Detected 2294.608 MHz processor Jan 21 00:57:19.009479 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 21 00:57:19.009487 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 21 00:57:19.009494 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000 Jan 21 00:57:19.009502 kernel: MTRR map: 4 entries (2 fixed + 2 
variable; max 18), built from 8 variable MTRRs Jan 21 00:57:19.009510 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 21 00:57:19.009517 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 21 00:57:19.009525 kernel: Using GB pages for direct mapping Jan 21 00:57:19.009534 kernel: ACPI: Early table checksum verification disabled Jan 21 00:57:19.009542 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 21 00:57:19.009550 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Jan 21 00:57:19.009558 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 21 00:57:19.009565 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 21 00:57:19.009573 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 21 00:57:19.009580 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 21 00:57:19.009590 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 21 00:57:19.009597 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 21 00:57:19.009605 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 21 00:57:19.009613 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3] Jan 21 00:57:19.009620 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b] Jan 21 00:57:19.009628 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 21 00:57:19.009635 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f] Jan 21 00:57:19.009644 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b] Jan 21 00:57:19.009652 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027] Jan 21 00:57:19.009659 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037] Jan 21 00:57:19.009667 kernel: No NUMA configuration found Jan 21 
00:57:19.009674 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff] Jan 21 00:57:19.009692 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff] Jan 21 00:57:19.009699 kernel: Zone ranges: Jan 21 00:57:19.009709 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 21 00:57:19.009717 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 21 00:57:19.009725 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff] Jan 21 00:57:19.009732 kernel: Device empty Jan 21 00:57:19.009740 kernel: Movable zone start for each node Jan 21 00:57:19.009747 kernel: Early memory node ranges Jan 21 00:57:19.009754 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 21 00:57:19.009762 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Jan 21 00:57:19.009772 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Jan 21 00:57:19.009779 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Jan 21 00:57:19.009787 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff] Jan 21 00:57:19.009794 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff] Jan 21 00:57:19.009801 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff] Jan 21 00:57:19.009816 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff] Jan 21 00:57:19.009827 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff] Jan 21 00:57:19.009834 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff] Jan 21 00:57:19.009843 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff] Jan 21 00:57:19.009850 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 21 00:57:19.009861 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 21 00:57:19.009869 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Jan 21 00:57:19.009877 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 21 00:57:19.009886 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Jan 21 00:57:19.009896 kernel: On node 0, 
zone DMA32: 193 pages in unavailable ranges Jan 21 00:57:19.009904 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Jan 21 00:57:19.009912 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 21 00:57:19.009921 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Jan 21 00:57:19.009929 kernel: On node 0, zone Normal: 276 pages in unavailable ranges Jan 21 00:57:19.009937 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 21 00:57:19.009946 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 21 00:57:19.009956 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 21 00:57:19.009965 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 21 00:57:19.009973 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 21 00:57:19.009981 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 21 00:57:19.009989 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 21 00:57:19.009998 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jan 21 00:57:19.010006 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 21 00:57:19.010016 kernel: TSC deadline timer available Jan 21 00:57:19.010025 kernel: CPU topo: Max. logical packages: 2 Jan 21 00:57:19.010033 kernel: CPU topo: Max. logical dies: 2 Jan 21 00:57:19.010041 kernel: CPU topo: Max. dies per package: 1 Jan 21 00:57:19.010049 kernel: CPU topo: Max. threads per core: 1 Jan 21 00:57:19.010057 kernel: CPU topo: Num. cores per package: 1 Jan 21 00:57:19.010065 kernel: CPU topo: Num. 
threads per package: 1 Jan 21 00:57:19.010073 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 21 00:57:19.010083 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 21 00:57:19.010091 kernel: kvm-guest: KVM setup pv remote TLB flush Jan 21 00:57:19.010099 kernel: kvm-guest: setup PV sched yield Jan 21 00:57:19.010108 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 21 00:57:19.010116 kernel: Booting paravirtualized kernel on KVM Jan 21 00:57:19.010124 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 21 00:57:19.010133 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 21 00:57:19.010143 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 21 00:57:19.010151 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 21 00:57:19.010160 kernel: pcpu-alloc: [0] 0 1 Jan 21 00:57:19.010168 kernel: kvm-guest: PV spinlocks enabled Jan 21 00:57:19.010176 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 21 00:57:19.010186 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=febd26d0ecadb4f9abb44f6b2a89e793f13258cbb011a4bfe78289e5448c772a Jan 21 00:57:19.010194 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 21 00:57:19.010205 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 21 00:57:19.010213 kernel: Fallback order for Node 0: 0 Jan 21 00:57:19.010221 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 1046694 Jan 21 00:57:19.010229 kernel: Policy zone: Normal Jan 21 00:57:19.010237 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 21 00:57:19.010245 kernel: software IO TLB: area num 2. Jan 21 00:57:19.010253 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 21 00:57:19.010263 kernel: ftrace: allocating 40097 entries in 157 pages Jan 21 00:57:19.010271 kernel: ftrace: allocated 157 pages with 5 groups Jan 21 00:57:19.010279 kernel: Dynamic Preempt: voluntary Jan 21 00:57:19.010288 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 21 00:57:19.010297 kernel: rcu: RCU event tracing is enabled. Jan 21 00:57:19.010306 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 21 00:57:19.010314 kernel: Trampoline variant of Tasks RCU enabled. Jan 21 00:57:19.010324 kernel: Rude variant of Tasks RCU enabled. Jan 21 00:57:19.010332 kernel: Tracing variant of Tasks RCU enabled. Jan 21 00:57:19.010340 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 21 00:57:19.010348 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 21 00:57:19.010357 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 21 00:57:19.010365 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 21 00:57:19.010374 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 21 00:57:19.010384 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 21 00:57:19.010392 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Jan 21 00:57:19.010400 kernel: Console: colour dummy device 80x25 Jan 21 00:57:19.010408 kernel: printk: legacy console [tty0] enabled Jan 21 00:57:19.010417 kernel: printk: legacy console [ttyS0] enabled Jan 21 00:57:19.010425 kernel: ACPI: Core revision 20240827 Jan 21 00:57:19.010433 kernel: APIC: Switch to symmetric I/O mode setup Jan 21 00:57:19.010441 kernel: x2apic enabled Jan 21 00:57:19.010451 kernel: APIC: Switched APIC routing to: physical x2apic Jan 21 00:57:19.010460 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Jan 21 00:57:19.010468 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Jan 21 00:57:19.010476 kernel: kvm-guest: setup PV IPIs Jan 21 00:57:19.010485 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 21 00:57:19.010493 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608) Jan 21 00:57:19.010501 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 21 00:57:19.010512 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 21 00:57:19.010519 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 21 00:57:19.010527 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 21 00:57:19.010535 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jan 21 00:57:19.010542 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 21 00:57:19.010550 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 21 00:57:19.010558 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 21 00:57:19.010566 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 21 00:57:19.010574 kernel: TAA: Mitigation: Clear CPU buffers Jan 21 00:57:19.010581 kernel: MMIO Stale Data: Mitigation: Clear CPU 
buffers Jan 21 00:57:19.010591 kernel: active return thunk: its_return_thunk Jan 21 00:57:19.010598 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 21 00:57:19.010606 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 21 00:57:19.010614 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 21 00:57:19.010622 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 21 00:57:19.010630 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 21 00:57:19.010637 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 21 00:57:19.010645 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 21 00:57:19.010653 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 21 00:57:19.010663 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 21 00:57:19.010671 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 21 00:57:19.010687 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 21 00:57:19.010696 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 21 00:57:19.010703 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 21 00:57:19.010711 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. Jan 21 00:57:19.010719 kernel: Freeing SMP alternatives memory: 32K Jan 21 00:57:19.010727 kernel: pid_max: default: 32768 minimum: 301 Jan 21 00:57:19.010734 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 21 00:57:19.010742 kernel: landlock: Up and running. Jan 21 00:57:19.010750 kernel: SELinux: Initializing. 
Jan 21 00:57:19.010757 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 21 00:57:19.010767 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 21 00:57:19.010775 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Jan 21 00:57:19.010783 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Jan 21 00:57:19.010791 kernel: ... version: 2 Jan 21 00:57:19.010800 kernel: ... bit width: 48 Jan 21 00:57:19.010808 kernel: ... generic registers: 8 Jan 21 00:57:19.010817 kernel: ... value mask: 0000ffffffffffff Jan 21 00:57:19.010825 kernel: ... max period: 00007fffffffffff Jan 21 00:57:19.010835 kernel: ... fixed-purpose events: 3 Jan 21 00:57:19.010843 kernel: ... event mask: 00000007000000ff Jan 21 00:57:19.010851 kernel: signal: max sigframe size: 3632 Jan 21 00:57:19.010860 kernel: rcu: Hierarchical SRCU implementation. Jan 21 00:57:19.010868 kernel: rcu: Max phase no-delay instances is 400. Jan 21 00:57:19.010876 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 21 00:57:19.010884 kernel: smp: Bringing up secondary CPUs ... Jan 21 00:57:19.010894 kernel: smpboot: x86: Booting SMP configuration: Jan 21 00:57:19.010903 kernel: .... 
node #0, CPUs: #1 Jan 21 00:57:19.010911 kernel: smp: Brought up 1 node, 2 CPUs Jan 21 00:57:19.010919 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS) Jan 21 00:57:19.010928 kernel: Memory: 3969768K/4186776K available (14336K kernel code, 2445K rwdata, 31636K rodata, 15532K init, 2508K bss, 212128K reserved, 0K cma-reserved) Jan 21 00:57:19.010936 kernel: devtmpfs: initialized Jan 21 00:57:19.010945 kernel: x86/mm: Memory block size: 128MB Jan 21 00:57:19.010955 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Jan 21 00:57:19.010963 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Jan 21 00:57:19.010971 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Jan 21 00:57:19.010980 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 21 00:57:19.010988 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes) Jan 21 00:57:19.010996 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes) Jan 21 00:57:19.011005 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 21 00:57:19.011015 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 21 00:57:19.011023 kernel: pinctrl core: initialized pinctrl subsystem Jan 21 00:57:19.011032 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 21 00:57:19.011040 kernel: audit: initializing netlink subsys (disabled) Jan 21 00:57:19.011048 kernel: audit: type=2000 audit(1768957035.042:1): state=initialized audit_enabled=0 res=1 Jan 21 00:57:19.011056 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 21 00:57:19.011064 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 21 00:57:19.011075 kernel: cpuidle: using governor menu Jan 21 00:57:19.011083 kernel: acpiphp: ACPI Hot Plug PCI Controller 
Driver version: 0.5 Jan 21 00:57:19.011091 kernel: dca service started, version 1.12.1 Jan 21 00:57:19.011100 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 21 00:57:19.011108 kernel: PCI: Using configuration type 1 for base access Jan 21 00:57:19.011116 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 21 00:57:19.011124 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 21 00:57:19.011135 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 21 00:57:19.011143 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 21 00:57:19.011151 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 21 00:57:19.011159 kernel: ACPI: Added _OSI(Module Device) Jan 21 00:57:19.011167 kernel: ACPI: Added _OSI(Processor Device) Jan 21 00:57:19.011175 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 21 00:57:19.011183 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 21 00:57:19.011193 kernel: ACPI: Interpreter enabled Jan 21 00:57:19.011202 kernel: ACPI: PM: (supports S0 S3 S5) Jan 21 00:57:19.011210 kernel: ACPI: Using IOAPIC for interrupt routing Jan 21 00:57:19.011218 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 21 00:57:19.011226 kernel: PCI: Using E820 reservations for host bridge windows Jan 21 00:57:19.011235 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 21 00:57:19.011243 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 21 00:57:19.011428 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 21 00:57:19.011536 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 21 00:57:19.011634 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 21 00:57:19.011645 kernel: PCI host bridge to bus 0000:00 Jan 21 
00:57:19.011769 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 21 00:57:19.011859 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 21 00:57:19.011949 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 21 00:57:19.012035 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 21 00:57:19.012122 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 21 00:57:19.012207 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Jan 21 00:57:19.012294 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 21 00:57:19.012409 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 21 00:57:19.012517 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 21 00:57:19.012631 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 21 00:57:19.012781 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Jan 21 00:57:19.012875 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Jan 21 00:57:19.012976 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 21 00:57:19.013083 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 21 00:57:19.013193 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.013295 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Jan 21 00:57:19.013392 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 21 00:57:19.013487 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 21 00:57:19.013587 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 21 00:57:19.013693 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.013797 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 
21 00:57:19.013892 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Jan 21 00:57:19.013988 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 21 00:57:19.014083 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 21 00:57:19.014182 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 21 00:57:19.014287 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.014387 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Jan 21 00:57:19.014484 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 21 00:57:19.014580 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 21 00:57:19.014674 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 21 00:57:19.014792 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.014888 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Jan 21 00:57:19.014985 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 21 00:57:19.015097 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 21 00:57:19.015205 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 21 00:57:19.015314 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.015417 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Jan 21 00:57:19.015516 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 21 00:57:19.015613 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 21 00:57:19.015725 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 21 00:57:19.015837 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.015937 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Jan 21 00:57:19.016042 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 21 00:57:19.016142 
kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 21 00:57:19.016237 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 21 00:57:19.016343 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.016443 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Jan 21 00:57:19.016543 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 21 00:57:19.016695 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 21 00:57:19.016796 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 21 00:57:19.016904 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.017003 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Jan 21 00:57:19.017108 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 21 00:57:19.017215 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 21 00:57:19.017321 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 21 00:57:19.017430 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.017532 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Jan 21 00:57:19.017629 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 21 00:57:19.017741 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 21 00:57:19.017839 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 21 00:57:19.017950 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.018063 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Jan 21 00:57:19.018162 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 21 00:57:19.018258 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 21 00:57:19.018359 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] 
Jan 21 00:57:19.018463 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.018559 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Jan 21 00:57:19.018660 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 21 00:57:19.018767 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 21 00:57:19.018863 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 21 00:57:19.018971 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.019070 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Jan 21 00:57:19.019170 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 21 00:57:19.019266 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 21 00:57:19.019364 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 21 00:57:19.019467 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.019567 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Jan 21 00:57:19.019661 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 21 00:57:19.019770 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 21 00:57:19.019866 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 21 00:57:19.019973 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.020070 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Jan 21 00:57:19.020166 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 21 00:57:19.020261 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 21 00:57:19.020356 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 21 00:57:19.020461 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.020561 kernel: pci 0000:00:03.6: BAR 0 [mem 
0x8438f000-0x8438ffff] Jan 21 00:57:19.020670 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 21 00:57:19.020781 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 21 00:57:19.020876 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 21 00:57:19.020979 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.021075 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Jan 21 00:57:19.021176 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 21 00:57:19.021271 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 21 00:57:19.021366 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 21 00:57:19.021468 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.021567 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Jan 21 00:57:19.021672 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 21 00:57:19.021796 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 21 00:57:19.021896 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 21 00:57:19.022004 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.022108 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Jan 21 00:57:19.022211 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 21 00:57:19.022316 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 21 00:57:19.022413 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 21 00:57:19.022528 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.022628 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Jan 21 00:57:19.022739 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 21 00:57:19.022837 kernel: pci 0000:00:04.2: bridge window [mem 
0x81c00000-0x81dfffff] Jan 21 00:57:19.022936 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 21 00:57:19.023045 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.023142 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 21 00:57:19.023238 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 21 00:57:19.023330 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 21 00:57:19.023425 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 21 00:57:19.023530 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.023628 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 21 00:57:19.023740 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 21 00:57:19.023837 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 21 00:57:19.023933 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 21 00:57:19.024037 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.024140 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 21 00:57:19.024238 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 21 00:57:19.024335 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 21 00:57:19.024432 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 21 00:57:19.024541 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.024652 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 21 00:57:19.024765 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 21 00:57:19.024861 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 21 00:57:19.024957 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 21 00:57:19.025060 kernel: pci 
0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.025159 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Jan 21 00:57:19.025260 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 21 00:57:19.025355 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 21 00:57:19.025447 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 21 00:57:19.025545 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.025644 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 21 00:57:19.025760 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 21 00:57:19.025856 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 21 00:57:19.025953 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 21 00:57:19.026058 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.026160 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 21 00:57:19.026263 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 21 00:57:19.026361 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 21 00:57:19.026460 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 21 00:57:19.026569 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.026666 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 21 00:57:19.026782 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 21 00:57:19.026885 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 21 00:57:19.026981 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 21 00:57:19.027085 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.027184 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 21 00:57:19.027281 kernel: 
pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 21 00:57:19.027378 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 21 00:57:19.027478 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 21 00:57:19.027586 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 21 00:57:19.027695 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 21 00:57:19.027788 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 21 00:57:19.027879 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 21 00:57:19.027970 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 21 00:57:19.028072 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 21 00:57:19.028164 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 21 00:57:19.028266 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 21 00:57:19.028369 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 21 00:57:19.028468 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 21 00:57:19.028633 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 21 00:57:19.028777 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 21 00:57:19.028885 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 21 00:57:19.028984 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 21 00:57:19.029082 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 21 00:57:19.029183 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 21 00:57:19.029283 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 21 00:57:19.029383 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.029482 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 21 
00:57:19.029592 kernel: pci_bus 0000:02: extended config space not accessible Jan 21 00:57:19.029605 kernel: acpiphp: Slot [1] registered Jan 21 00:57:19.029614 kernel: acpiphp: Slot [0] registered Jan 21 00:57:19.029626 kernel: acpiphp: Slot [2] registered Jan 21 00:57:19.029635 kernel: acpiphp: Slot [3] registered Jan 21 00:57:19.029643 kernel: acpiphp: Slot [4] registered Jan 21 00:57:19.029652 kernel: acpiphp: Slot [5] registered Jan 21 00:57:19.029661 kernel: acpiphp: Slot [6] registered Jan 21 00:57:19.029669 kernel: acpiphp: Slot [7] registered Jan 21 00:57:19.029678 kernel: acpiphp: Slot [8] registered Jan 21 00:57:19.029694 kernel: acpiphp: Slot [9] registered Jan 21 00:57:19.029705 kernel: acpiphp: Slot [10] registered Jan 21 00:57:19.029714 kernel: acpiphp: Slot [11] registered Jan 21 00:57:19.029722 kernel: acpiphp: Slot [12] registered Jan 21 00:57:19.029731 kernel: acpiphp: Slot [13] registered Jan 21 00:57:19.029740 kernel: acpiphp: Slot [14] registered Jan 21 00:57:19.029748 kernel: acpiphp: Slot [15] registered Jan 21 00:57:19.029757 kernel: acpiphp: Slot [16] registered Jan 21 00:57:19.029775 kernel: acpiphp: Slot [17] registered Jan 21 00:57:19.029784 kernel: acpiphp: Slot [18] registered Jan 21 00:57:19.029793 kernel: acpiphp: Slot [19] registered Jan 21 00:57:19.029801 kernel: acpiphp: Slot [20] registered Jan 21 00:57:19.029810 kernel: acpiphp: Slot [21] registered Jan 21 00:57:19.029818 kernel: acpiphp: Slot [22] registered Jan 21 00:57:19.029827 kernel: acpiphp: Slot [23] registered Jan 21 00:57:19.029835 kernel: acpiphp: Slot [24] registered Jan 21 00:57:19.029846 kernel: acpiphp: Slot [25] registered Jan 21 00:57:19.029854 kernel: acpiphp: Slot [26] registered Jan 21 00:57:19.029863 kernel: acpiphp: Slot [27] registered Jan 21 00:57:19.029872 kernel: acpiphp: Slot [28] registered Jan 21 00:57:19.029880 kernel: acpiphp: Slot [29] registered Jan 21 00:57:19.029888 kernel: acpiphp: Slot [30] registered Jan 21 00:57:19.029897 kernel: acpiphp: 
Slot [31] registered Jan 21 00:57:19.032578 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 21 00:57:19.032790 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 21 00:57:19.032896 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 21 00:57:19.032909 kernel: acpiphp: Slot [0-2] registered Jan 21 00:57:19.033022 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 21 00:57:19.033130 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 21 00:57:19.033239 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 21 00:57:19.033337 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 21 00:57:19.033439 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 21 00:57:19.033452 kernel: acpiphp: Slot [0-3] registered Jan 21 00:57:19.033559 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 21 00:57:19.033658 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 21 00:57:19.033771 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 21 00:57:19.033870 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 21 00:57:19.033883 kernel: acpiphp: Slot [0-4] registered Jan 21 00:57:19.033988 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 21 00:57:19.034087 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 21 00:57:19.034185 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 21 00:57:19.034199 kernel: acpiphp: Slot [0-5] registered Jan 21 00:57:19.034304 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 21 00:57:19.034402 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 21 00:57:19.034500 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 21 00:57:19.034600 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 21 00:57:19.034611 kernel: acpiphp: Slot [0-6] 
registered Jan 21 00:57:19.040303 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 21 00:57:19.040343 kernel: acpiphp: Slot [0-7] registered Jan 21 00:57:19.040468 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 21 00:57:19.040482 kernel: acpiphp: Slot [0-8] registered Jan 21 00:57:19.040624 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 21 00:57:19.040637 kernel: acpiphp: Slot [0-9] registered Jan 21 00:57:19.040760 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 21 00:57:19.040772 kernel: acpiphp: Slot [0-10] registered Jan 21 00:57:19.040876 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 21 00:57:19.040888 kernel: acpiphp: Slot [0-11] registered Jan 21 00:57:19.040986 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 21 00:57:19.040998 kernel: acpiphp: Slot [0-12] registered Jan 21 00:57:19.041097 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 21 00:57:19.041112 kernel: acpiphp: Slot [0-13] registered Jan 21 00:57:19.041225 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 21 00:57:19.041253 kernel: acpiphp: Slot [0-14] registered Jan 21 00:57:19.041461 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 21 00:57:19.041489 kernel: acpiphp: Slot [0-15] registered Jan 21 00:57:19.041679 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 21 00:57:19.041791 kernel: acpiphp: Slot [0-16] registered Jan 21 00:57:19.041932 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 21 00:57:19.041945 kernel: acpiphp: Slot [0-17] registered Jan 21 00:57:19.042043 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 21 00:57:19.042056 kernel: acpiphp: Slot [0-18] registered Jan 21 00:57:19.042156 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 21 00:57:19.042170 kernel: acpiphp: Slot [0-19] registered Jan 21 00:57:19.042270 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 21 00:57:19.042281 kernel: acpiphp: Slot [0-20] registered Jan 21 00:57:19.042379 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 21 00:57:19.042390 
kernel: acpiphp: Slot [0-21] registered Jan 21 00:57:19.042489 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 21 00:57:19.042501 kernel: acpiphp: Slot [0-22] registered Jan 21 00:57:19.042603 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 21 00:57:19.042614 kernel: acpiphp: Slot [0-23] registered Jan 21 00:57:19.042729 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 21 00:57:19.042741 kernel: acpiphp: Slot [0-24] registered Jan 21 00:57:19.042840 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 21 00:57:19.042852 kernel: acpiphp: Slot [0-25] registered Jan 21 00:57:19.042951 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 21 00:57:19.042962 kernel: acpiphp: Slot [0-26] registered Jan 21 00:57:19.043059 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 21 00:57:19.043071 kernel: acpiphp: Slot [0-27] registered Jan 21 00:57:19.043167 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 21 00:57:19.043179 kernel: acpiphp: Slot [0-28] registered Jan 21 00:57:19.043277 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 21 00:57:19.043292 kernel: acpiphp: Slot [0-29] registered Jan 21 00:57:19.043389 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 21 00:57:19.043401 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 21 00:57:19.043410 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 21 00:57:19.043419 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 21 00:57:19.043428 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 21 00:57:19.043439 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 21 00:57:19.043449 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 21 00:57:19.043457 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 21 00:57:19.043466 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 21 00:57:19.043475 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 21 00:57:19.043484 kernel: ACPI: PCI: Interrupt 
link GSIB configured for IRQ 17 Jan 21 00:57:19.043493 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 21 00:57:19.043503 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 21 00:57:19.043512 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 21 00:57:19.043520 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 21 00:57:19.043529 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 21 00:57:19.043538 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 21 00:57:19.043547 kernel: iommu: Default domain type: Translated Jan 21 00:57:19.043556 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 21 00:57:19.043567 kernel: efivars: Registered efivars operations Jan 21 00:57:19.043575 kernel: PCI: Using ACPI for IRQ routing Jan 21 00:57:19.043584 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 21 00:57:19.043593 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 21 00:57:19.043601 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 21 00:57:19.043610 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 21 00:57:19.043618 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 21 00:57:19.043627 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 21 00:57:19.043637 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 21 00:57:19.043646 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 21 00:57:19.043655 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 21 00:57:19.043663 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 21 00:57:19.043773 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 21 00:57:19.043871 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 21 00:57:19.043971 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 21 00:57:19.043982 kernel: vgaarb: loaded Jan 21 00:57:19.043991 
kernel: clocksource: Switched to clocksource kvm-clock Jan 21 00:57:19.044001 kernel: VFS: Disk quotas dquot_6.6.0 Jan 21 00:57:19.044010 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 21 00:57:19.044018 kernel: pnp: PnP ACPI init Jan 21 00:57:19.046219 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 21 00:57:19.046257 kernel: pnp: PnP ACPI: found 5 devices Jan 21 00:57:19.046267 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 21 00:57:19.046276 kernel: NET: Registered PF_INET protocol family Jan 21 00:57:19.046286 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 21 00:57:19.046296 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 21 00:57:19.046306 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 21 00:57:19.046315 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 21 00:57:19.046325 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 21 00:57:19.046334 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 21 00:57:19.046343 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 21 00:57:19.046352 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 21 00:57:19.046360 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 21 00:57:19.046369 kernel: NET: Registered PF_XDP protocol family Jan 21 00:57:19.046486 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 21 00:57:19.046593 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 21 00:57:19.047438 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 21 00:57:19.047568 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] 
add_size 1000 Jan 21 00:57:19.047669 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 21 00:57:19.047792 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 21 00:57:19.047895 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 21 00:57:19.048003 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 21 00:57:19.048107 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 21 00:57:19.048209 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 21 00:57:19.048311 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 21 00:57:19.048412 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 21 00:57:19.048513 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 21 00:57:19.048670 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 21 00:57:19.049095 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 21 00:57:19.049200 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 21 00:57:19.049303 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 21 00:57:19.049405 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 21 00:57:19.049506 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 21 00:57:19.049650 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 21 00:57:19.050522 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 21 00:57:19.050641 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 21 00:57:19.050764 kernel: pci 
0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 21 00:57:19.050868 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 21 00:57:19.050969 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 21 00:57:19.051071 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 21 00:57:19.051178 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 21 00:57:19.051282 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 21 00:57:19.051386 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 21 00:57:19.051487 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 21 00:57:19.051589 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 21 00:57:19.051744 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 21 00:57:19.051847 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 21 00:57:19.051952 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 21 00:57:19.052053 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 21 00:57:19.052152 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 21 00:57:19.052251 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 21 00:57:19.052352 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 21 00:57:19.052451 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 21 00:57:19.052558 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 21 00:57:19.052668 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 21 00:57:19.053531 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 21 00:57:19.053658 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: 
can't assign; no space Jan 21 00:57:19.053791 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.053894 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.053999 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.054112 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.054212 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.054318 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.054418 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.054521 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.054623 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.055344 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.055457 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.055560 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.055657 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.055773 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.055871 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.055977 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.056074 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.056174 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.056270 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.056369 kernel: pci 0000:00:05.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.056466 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.056565 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.056678 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.056804 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.056901 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.057003 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.057107 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.057205 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.057306 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.057400 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 21 00:57:19.057496 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 21 00:57:19.057591 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 21 00:57:19.057696 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 21 00:57:19.057793 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 21 00:57:19.057891 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 21 00:57:19.057985 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 21 00:57:19.058080 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 21 00:57:19.058174 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 21 00:57:19.058269 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 21 00:57:19.058368 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 21 00:57:19.058467 kernel: 
pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 21 00:57:19.058568 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 21 00:57:19.058667 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.058772 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.058870 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.058965 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.059064 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.059160 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.059282 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.059381 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.059480 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.059576 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.059674 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.061837 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.061942 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.062028 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.062122 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.062213 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.062308 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.062397 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.062491 
kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.062585 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.064087 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.064231 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.064336 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.064434 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.064534 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.064678 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.064786 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.064880 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.064976 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 21 00:57:19.065070 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 21 00:57:19.065173 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 21 00:57:19.065273 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 21 00:57:19.065369 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 21 00:57:19.065465 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.065561 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 21 00:57:19.065654 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 21 00:57:19.065754 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 21 00:57:19.065846 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.065947 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned 
Jan 21 00:57:19.066044 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 21 00:57:19.066138 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 21 00:57:19.066240 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 21 00:57:19.066335 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 21 00:57:19.066430 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 21 00:57:19.066522 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 21 00:57:19.066617 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 21 00:57:19.066721 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 21 00:57:19.066817 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 21 00:57:19.066913 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 21 00:57:19.067008 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 21 00:57:19.067104 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 21 00:57:19.067201 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 21 00:57:19.067297 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 21 00:57:19.067393 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 21 00:57:19.067498 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 21 00:57:19.067593 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 21 00:57:19.067707 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 21 00:57:19.067808 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 21 00:57:19.067905 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 21 00:57:19.068003 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 21 00:57:19.068102 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 21 
00:57:19.068198 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 21 00:57:19.068293 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 21 00:57:19.068394 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 21 00:57:19.068494 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 21 00:57:19.068605 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 21 00:57:19.068724 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 21 00:57:19.070919 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 21 00:57:19.071039 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 21 00:57:19.071140 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 21 00:57:19.071239 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 21 00:57:19.071335 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 21 00:57:19.071435 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 21 00:57:19.071531 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 21 00:57:19.071636 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 21 00:57:19.071755 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 21 00:57:19.071854 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 21 00:57:19.071955 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 21 00:57:19.072058 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 21 00:57:19.072156 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 21 00:57:19.072252 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 21 00:57:19.072355 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 21 00:57:19.072452 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 
21 00:57:19.072549 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 21 00:57:19.072691 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 21 00:57:19.072799 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 21 00:57:19.072892 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 21 00:57:19.072984 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 21 00:57:19.073084 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 21 00:57:19.073176 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 21 00:57:19.073268 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 21 00:57:19.073360 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 21 00:57:19.073458 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 21 00:57:19.073554 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 21 00:57:19.073647 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 21 00:57:19.073781 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 21 00:57:19.073881 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 21 00:57:19.073978 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 21 00:57:19.074073 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 21 00:57:19.074169 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 21 00:57:19.074273 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 21 00:57:19.074370 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 21 00:57:19.074465 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 21 00:57:19.074562 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 21 00:57:19.074663 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 21 00:57:19.075554 kernel: pci 
0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 21 00:57:19.075678 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 21 00:57:19.075800 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 21 00:57:19.075904 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 21 00:57:19.076000 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 21 00:57:19.076096 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 21 00:57:19.076191 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 21 00:57:19.076294 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 21 00:57:19.076390 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 21 00:57:19.076484 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 21 00:57:19.076588 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 21 00:57:19.076698 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 21 00:57:19.076793 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 21 00:57:19.076890 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 21 00:57:19.076985 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 21 00:57:19.077084 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 21 00:57:19.077179 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 21 00:57:19.077273 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 21 00:57:19.077368 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 21 00:57:19.077470 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 21 00:57:19.077565 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 21 00:57:19.077659 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 21 00:57:19.078065 kernel: pci 0000:00:05.2: bridge window [mem 
0x38d000000000-0x38d7ffffffff 64bit pref] Jan 21 00:57:19.078166 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 21 00:57:19.078264 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 21 00:57:19.078362 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 21 00:57:19.078456 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 21 00:57:19.078555 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 21 00:57:19.078650 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 21 00:57:19.078754 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 21 00:57:19.078850 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 21 00:57:19.078954 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 21 00:57:19.079045 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 21 00:57:19.079133 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 21 00:57:19.079220 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 21 00:57:19.079305 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 21 00:57:19.079391 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 21 00:57:19.079496 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 21 00:57:19.079587 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 21 00:57:19.079676 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.079792 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 21 00:57:19.079885 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 21 00:57:19.079977 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 21 00:57:19.080080 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 21 00:57:19.080171 kernel: pci_bus 0000:03: resource 2 [mem 
0x380800000000-0x380fffffffff 64bit pref] Jan 21 00:57:19.080269 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 21 00:57:19.080358 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 21 00:57:19.080463 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 21 00:57:19.080556 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 21 00:57:19.080667 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 21 00:57:19.080764 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 21 00:57:19.080860 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 21 00:57:19.080947 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 21 00:57:19.081048 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 21 00:57:19.081136 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 21 00:57:19.081230 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 21 00:57:19.081316 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 21 00:57:19.081414 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Jan 21 00:57:19.081505 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 21 00:57:19.081600 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 21 00:57:19.081697 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 21 00:57:19.081795 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 21 00:57:19.081886 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 21 00:57:19.081986 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 21 00:57:19.082075 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 21 00:57:19.082177 
kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 21 00:57:19.082268 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 21 00:57:19.082364 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 21 00:57:19.082457 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 21 00:57:19.082554 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 21 00:57:19.082644 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 21 00:57:19.082750 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 21 00:57:19.082840 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 21 00:57:19.082947 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 21 00:57:19.083036 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 21 00:57:19.083127 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 21 00:57:19.083225 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 21 00:57:19.083314 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 21 00:57:19.083403 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 21 00:57:19.083502 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 21 00:57:19.083591 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 21 00:57:19.083685 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 21 00:57:19.083784 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 21 00:57:19.083874 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 21 00:57:19.083963 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 21 00:57:19.084064 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 21 00:57:19.084155 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] 
Jan 21 00:57:19.084243 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 21 00:57:19.084342 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 21 00:57:19.084432 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 21 00:57:19.084523 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 21 00:57:19.084627 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 21 00:57:19.084735 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 21 00:57:19.084825 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 21 00:57:19.084921 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 21 00:57:19.085011 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Jan 21 00:57:19.085104 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 21 00:57:19.085199 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 21 00:57:19.085289 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 21 00:57:19.085377 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 21 00:57:19.085476 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 21 00:57:19.085569 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 21 00:57:19.085657 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 21 00:57:19.085765 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 21 00:57:19.085853 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 21 00:57:19.085941 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 21 00:57:19.086038 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 21 00:57:19.086131 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 21 00:57:19.086219 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit 
pref] Jan 21 00:57:19.086315 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 21 00:57:19.086405 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 21 00:57:19.086494 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 21 00:57:19.086506 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 21 00:57:19.086518 kernel: PCI: CLS 0 bytes, default 64 Jan 21 00:57:19.086527 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 21 00:57:19.086536 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 21 00:57:19.086544 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 21 00:57:19.086553 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 21 00:57:19.086563 kernel: Initialise system trusted keyrings Jan 21 00:57:19.086572 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 21 00:57:19.086583 kernel: Key type asymmetric registered Jan 21 00:57:19.086591 kernel: Asymmetric key parser 'x509' registered Jan 21 00:57:19.086600 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 21 00:57:19.086609 kernel: io scheduler mq-deadline registered Jan 21 00:57:19.086617 kernel: io scheduler kyber registered Jan 21 00:57:19.086626 kernel: io scheduler bfq registered Jan 21 00:57:19.086736 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 21 00:57:19.086839 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 21 00:57:19.086940 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 21 00:57:19.087038 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 21 00:57:19.087138 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 21 00:57:19.087235 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 21 00:57:19.087336 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 21 00:57:19.087433 
kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 21 00:57:19.087532 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 21 00:57:19.087628 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 21 00:57:19.087736 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 21 00:57:19.087836 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 21 00:57:19.087934 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 21 00:57:19.088030 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 21 00:57:19.088126 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 21 00:57:19.088224 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 21 00:57:19.088237 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 21 00:57:19.088334 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 21 00:57:19.088431 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 21 00:57:19.088531 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 21 00:57:19.088636 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 21 00:57:19.088755 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 21 00:57:19.088852 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 21 00:57:19.088947 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 21 00:57:19.089041 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 21 00:57:19.089135 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 21 00:57:19.089231 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 21 00:57:19.089327 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 21 00:57:19.089420 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 21 00:57:19.089515 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 21 00:57:19.089608 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 21 00:57:19.090217 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 21 00:57:19.090348 kernel: 
pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 21 00:57:19.090360 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 21 00:57:19.090463 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 21 00:57:19.090565 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 21 00:57:19.090669 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 21 00:57:19.092709 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 21 00:57:19.092829 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 21 00:57:19.092927 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 21 00:57:19.093026 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 21 00:57:19.093123 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 21 00:57:19.093222 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 21 00:57:19.093319 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 21 00:57:19.093422 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 21 00:57:19.093517 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 21 00:57:19.093617 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 21 00:57:19.093743 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 21 00:57:19.093849 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 21 00:57:19.093955 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 21 00:57:19.093967 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 21 00:57:19.094077 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 21 00:57:19.094180 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 21 00:57:19.094282 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 21 00:57:19.094383 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 21 00:57:19.094484 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 21 00:57:19.094581 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 21 00:57:19.094694 kernel: pcieport 0000:00:05.3: 
PME: Signaling with IRQ 51 Jan 21 00:57:19.094798 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 21 00:57:19.094897 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 21 00:57:19.094994 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 21 00:57:19.095005 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 21 00:57:19.095014 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 21 00:57:19.095023 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 21 00:57:19.095035 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 21 00:57:19.095044 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 21 00:57:19.095053 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 21 00:57:19.095163 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 21 00:57:19.095177 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 21 00:57:19.095268 kernel: rtc_cmos 00:03: registered as rtc0 Jan 21 00:57:19.095362 kernel: rtc_cmos 00:03: setting system clock to 2026-01-21T00:57:17 UTC (1768957037) Jan 21 00:57:19.095456 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 21 00:57:19.095467 kernel: intel_pstate: CPU model not supported Jan 21 00:57:19.095476 kernel: efifb: probing for efifb Jan 21 00:57:19.095485 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 21 00:57:19.095494 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 21 00:57:19.095502 kernel: efifb: scrolling: redraw Jan 21 00:57:19.095511 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 21 00:57:19.095522 kernel: Console: switching to colour frame buffer device 160x50 Jan 21 00:57:19.095531 kernel: fb0: EFI VGA frame buffer device Jan 21 00:57:19.095540 kernel: pstore: Using crash dump compression: deflate Jan 21 00:57:19.095549 kernel: pstore: Registered efi_pstore as persistent store backend Jan 21 
00:57:19.095557 kernel: NET: Registered PF_INET6 protocol family Jan 21 00:57:19.095566 kernel: Segment Routing with IPv6 Jan 21 00:57:19.095575 kernel: In-situ OAM (IOAM) with IPv6 Jan 21 00:57:19.095586 kernel: NET: Registered PF_PACKET protocol family Jan 21 00:57:19.095594 kernel: Key type dns_resolver registered Jan 21 00:57:19.095603 kernel: IPI shorthand broadcast: enabled Jan 21 00:57:19.095612 kernel: sched_clock: Marking stable (2534001744, 156890573)->(2796264908, -105372591) Jan 21 00:57:19.095621 kernel: registered taskstats version 1 Jan 21 00:57:19.095630 kernel: Loading compiled-in X.509 certificates Jan 21 00:57:19.095639 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 169e95345ec0c7da7389f5f6d7b9c06dfd352178' Jan 21 00:57:19.095650 kernel: Demotion targets for Node 0: null Jan 21 00:57:19.095659 kernel: Key type .fscrypt registered Jan 21 00:57:19.095668 kernel: Key type fscrypt-provisioning registered Jan 21 00:57:19.095676 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 21 00:57:19.095695 kernel: ima: Allocated hash algorithm: sha1 Jan 21 00:57:19.095703 kernel: ima: No architecture policies found Jan 21 00:57:19.095712 kernel: clk: Disabling unused clocks Jan 21 00:57:19.095723 kernel: Freeing unused kernel image (initmem) memory: 15532K Jan 21 00:57:19.095731 kernel: Write protecting the kernel read-only data: 47104k Jan 21 00:57:19.095740 kernel: Freeing unused kernel image (rodata/data gap) memory: 1132K Jan 21 00:57:19.095749 kernel: Run /init as init process Jan 21 00:57:19.095758 kernel: with arguments: Jan 21 00:57:19.095768 kernel: /init Jan 21 00:57:19.095776 kernel: with environment: Jan 21 00:57:19.095785 kernel: HOME=/ Jan 21 00:57:19.095795 kernel: TERM=linux Jan 21 00:57:19.095804 kernel: SCSI subsystem initialized Jan 21 00:57:19.095812 kernel: libata version 3.00 loaded. 
Jan 21 00:57:19.095918 kernel: ahci 0000:00:1f.2: version 3.0 Jan 21 00:57:19.095931 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 21 00:57:19.096028 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 21 00:57:19.096129 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 21 00:57:19.096226 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 21 00:57:19.096343 kernel: scsi host0: ahci Jan 21 00:57:19.096458 kernel: scsi host1: ahci Jan 21 00:57:19.096590 kernel: scsi host2: ahci Jan 21 00:57:19.097893 kernel: scsi host3: ahci Jan 21 00:57:19.098062 kernel: scsi host4: ahci Jan 21 00:57:19.098174 kernel: scsi host5: ahci Jan 21 00:57:19.098187 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 21 00:57:19.098200 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 21 00:57:19.098209 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 21 00:57:19.098219 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 21 00:57:19.098230 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 21 00:57:19.098240 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 21 00:57:19.098249 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098258 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098267 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098277 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098286 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098296 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 21 00:57:19.098306 kernel: ACPI: bus type USB registered Jan 21 00:57:19.098314 kernel: usbcore: registered new interface driver usbfs Jan 21 00:57:19.098323 kernel: usbcore: registered 
new interface driver hub Jan 21 00:57:19.098332 kernel: usbcore: registered new device driver usb Jan 21 00:57:19.098450 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 21 00:57:19.098557 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 21 00:57:19.098664 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 21 00:57:19.098785 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 21 00:57:19.098914 kernel: hub 1-0:1.0: USB hub found Jan 21 00:57:19.099020 kernel: hub 1-0:1.0: 2 ports detected Jan 21 00:57:19.099128 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 21 00:57:19.099224 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 21 00:57:19.099236 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 21 00:57:19.099245 kernel: GPT:25804799 != 104857599 Jan 21 00:57:19.099254 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 21 00:57:19.099263 kernel: GPT:25804799 != 104857599 Jan 21 00:57:19.099271 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 21 00:57:19.099279 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 21 00:57:19.099290 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 21 00:57:19.099299 kernel: device-mapper: uevent: version 1.0.3 Jan 21 00:57:19.099308 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 21 00:57:19.099316 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 21 00:57:19.099325 kernel: raid6: avx512x4 gen() 44110 MB/s Jan 21 00:57:19.099333 kernel: raid6: avx512x2 gen() 45358 MB/s Jan 21 00:57:19.099341 kernel: raid6: avx512x1 gen() 43861 MB/s Jan 21 00:57:19.099352 kernel: raid6: avx2x4 gen() 34372 MB/s Jan 21 00:57:19.099360 kernel: raid6: avx2x2 gen() 34255 MB/s Jan 21 00:57:19.099368 kernel: raid6: avx2x1 gen() 30635 MB/s Jan 21 00:57:19.099377 kernel: raid6: using algorithm avx512x2 gen() 45358 MB/s Jan 21 00:57:19.099385 kernel: raid6: .... xor() 27071 MB/s, rmw enabled Jan 21 00:57:19.099396 kernel: raid6: using avx512x2 recovery algorithm Jan 21 00:57:19.099406 kernel: xor: automatically using best checksumming function avx Jan 21 00:57:19.099414 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 21 00:57:19.099423 kernel: BTRFS: device fsid 1d50d7f2-b244-4434-b37e-796fa0c23345 devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (205) Jan 21 00:57:19.099431 kernel: BTRFS info (device dm-0): first mount of filesystem 1d50d7f2-b244-4434-b37e-796fa0c23345 Jan 21 00:57:19.099440 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 21 00:57:19.099558 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 21 00:57:19.099571 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 21 00:57:19.099581 kernel: BTRFS info (device dm-0): enabling free space tree Jan 21 00:57:19.099590 kernel: loop: module loaded Jan 21 00:57:19.099613 kernel: loop0: detected capacity change from 0 to 100552 Jan 21 00:57:19.099621 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 21 00:57:19.099631 systemd[1]: Successfully made /usr/ read-only. 
Jan 21 00:57:19.099644 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 21 00:57:19.099656 systemd[1]: Detected virtualization kvm. Jan 21 00:57:19.099664 systemd[1]: Detected architecture x86-64. Jan 21 00:57:19.099673 systemd[1]: Running in initrd. Jan 21 00:57:19.099711 systemd[1]: No hostname configured, using default hostname. Jan 21 00:57:19.099720 systemd[1]: Hostname set to . Jan 21 00:57:19.099729 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 21 00:57:19.099742 systemd[1]: Queued start job for default target initrd.target. Jan 21 00:57:19.099750 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 21 00:57:19.099759 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 21 00:57:19.099768 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 21 00:57:19.099778 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 21 00:57:19.099787 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 21 00:57:19.099798 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 21 00:57:19.099807 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 21 00:57:19.099816 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 21 00:57:19.099825 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 21 00:57:19.099834 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 21 00:57:19.099843 systemd[1]: Reached target paths.target - Path Units. Jan 21 00:57:19.099854 systemd[1]: Reached target slices.target - Slice Units. Jan 21 00:57:19.099862 systemd[1]: Reached target swap.target - Swaps. Jan 21 00:57:19.099871 systemd[1]: Reached target timers.target - Timer Units. Jan 21 00:57:19.099880 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 21 00:57:19.099889 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 21 00:57:19.099898 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 21 00:57:19.099907 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 21 00:57:19.099917 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 21 00:57:19.099926 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 21 00:57:19.099935 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 21 00:57:19.099944 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 21 00:57:19.099953 systemd[1]: Reached target sockets.target - Socket Units. Jan 21 00:57:19.099963 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 21 00:57:19.099971 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 21 00:57:19.099982 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 21 00:57:19.099991 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 21 00:57:19.100001 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). 
Jan 21 00:57:19.100009 systemd[1]: Starting systemd-fsck-usr.service... Jan 21 00:57:19.100018 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 21 00:57:19.100027 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 21 00:57:19.100038 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:19.100048 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 21 00:57:19.100056 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 21 00:57:19.100065 systemd[1]: Finished systemd-fsck-usr.service. Jan 21 00:57:19.100076 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 21 00:57:19.100109 systemd-journald[343]: Collecting audit messages is enabled. Jan 21 00:57:19.100131 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 21 00:57:19.100143 kernel: audit: type=1130 audit(1768957039.034:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.100152 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 21 00:57:19.100160 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 21 00:57:19.100169 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:19.100178 kernel: Bridge firewalling registered Jan 21 00:57:19.100187 kernel: audit: type=1130 audit(1768957039.054:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:19.100198 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 21 00:57:19.100207 kernel: audit: type=1130 audit(1768957039.062:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.100216 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 21 00:57:19.100224 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 21 00:57:19.100233 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 21 00:57:19.100243 systemd-journald[343]: Journal started Jan 21 00:57:19.100265 systemd-journald[343]: Runtime Journal (/run/log/journal/946d775c848e45838c905fb1affa53e4) is 8M, max 77.9M, 69.9M free. Jan 21 00:57:19.104480 kernel: audit: type=1130 audit(1768957039.100:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:19.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.054050 systemd-modules-load[345]: Inserted module 'br_netfilter' Jan 21 00:57:19.107697 systemd[1]: Started systemd-journald.service - Journal Service. Jan 21 00:57:19.114700 kernel: audit: type=1130 audit(1768957039.108:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.108000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.117961 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 21 00:57:19.119072 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 21 00:57:19.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.122121 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 21 00:57:19.130639 kernel: audit: type=1130 audit(1768957039.120:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.130670 kernel: audit: type=1130 audit(1768957039.125:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:19.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.133824 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 21 00:57:19.135000 audit: BPF prog-id=6 op=LOAD Jan 21 00:57:19.137701 kernel: audit: type=1334 audit(1768957039.135:9): prog-id=6 op=LOAD Jan 21 00:57:19.138952 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 21 00:57:19.144357 systemd-tmpfiles[371]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 21 00:57:19.149706 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 21 00:57:19.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.155715 kernel: audit: type=1130 audit(1768957039.150:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.166705 dracut-cmdline[377]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=febd26d0ecadb4f9abb44f6b2a89e793f13258cbb011a4bfe78289e5448c772a Jan 21 00:57:19.192352 systemd-resolved[378]: Positive Trust Anchors: Jan 21 00:57:19.193249 systemd-resolved[378]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 21 00:57:19.193857 systemd-resolved[378]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 21 00:57:19.193890 systemd-resolved[378]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 21 00:57:19.221635 systemd-resolved[378]: Defaulting to hostname 'linux'. Jan 21 00:57:19.223384 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 21 00:57:19.223000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.224677 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 21 00:57:19.253751 kernel: Loading iSCSI transport class v2.0-870. Jan 21 00:57:19.270716 kernel: iscsi: registered transport (tcp) Jan 21 00:57:19.295784 kernel: iscsi: registered transport (qla4xxx) Jan 21 00:57:19.295877 kernel: QLogic iSCSI HBA Driver Jan 21 00:57:19.324986 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 21 00:57:19.342779 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 21 00:57:19.343000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 21 00:57:19.346356 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 21 00:57:19.388659 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 21 00:57:19.388000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.390908 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 21 00:57:19.394812 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 21 00:57:19.422439 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 21 00:57:19.422000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.423000 audit: BPF prog-id=7 op=LOAD Jan 21 00:57:19.423000 audit: BPF prog-id=8 op=LOAD Jan 21 00:57:19.425020 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 21 00:57:19.452246 systemd-udevd[621]: Using default interface naming scheme 'v257'. Jan 21 00:57:19.461302 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 21 00:57:19.462000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.464272 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 21 00:57:19.487415 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 21 00:57:19.487000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.488000 audit: BPF prog-id=9 op=LOAD Jan 21 00:57:19.490861 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 21 00:57:19.494283 dracut-pre-trigger[693]: rd.md=0: removing MD RAID activation Jan 21 00:57:19.521352 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 21 00:57:19.521000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.524804 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 21 00:57:19.535254 systemd-networkd[726]: lo: Link UP Jan 21 00:57:19.535263 systemd-networkd[726]: lo: Gained carrier Jan 21 00:57:19.538000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.538193 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 21 00:57:19.538883 systemd[1]: Reached target network.target - Network. Jan 21 00:57:19.616314 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 21 00:57:19.616000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.619552 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 21 00:57:19.713490 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jan 21 00:57:19.723163 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 21 00:57:19.746223 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 21 00:57:19.759122 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 21 00:57:19.761046 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 21 00:57:19.790734 kernel: cryptd: max_cpu_qlen set to 1000 Jan 21 00:57:19.801060 disk-uuid[796]: Primary Header is updated. Jan 21 00:57:19.801060 disk-uuid[796]: Secondary Entries is updated. Jan 21 00:57:19.801060 disk-uuid[796]: Secondary Header is updated. Jan 21 00:57:19.801155 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 21 00:57:19.807917 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:19.807000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.808591 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:19.815162 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:19.825719 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 21 00:57:19.836705 kernel: AES CTR mode by8 optimization enabled Jan 21 00:57:19.880045 kernel: usbcore: registered new interface driver usbhid Jan 21 00:57:19.880101 kernel: usbhid: USB HID core driver Jan 21 00:57:19.900000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:19.910707 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jan 21 00:57:19.914744 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 21 00:57:19.919710 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 21 00:57:19.925578 systemd-networkd[726]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 21 00:57:19.926638 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 21 00:57:19.926717 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:19.927141 systemd-networkd[726]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 21 00:57:19.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.929767 systemd-networkd[726]: eth0: Link UP Jan 21 00:57:19.929816 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:19.929965 systemd-networkd[726]: eth0: Gained carrier Jan 21 00:57:19.929979 systemd-networkd[726]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 21 00:57:19.934805 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:19.970703 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:19.970000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:19.981509 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 21 00:57:19.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:19.982465 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 21 00:57:19.983135 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 21 00:57:19.983908 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 21 00:57:19.985619 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 21 00:57:19.996783 systemd-networkd[726]: eth0: DHCPv4 address 10.0.5.74/25, gateway 10.0.5.1 acquired from 10.0.5.1 Jan 21 00:57:20.009390 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 21 00:57:20.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:20.895998 disk-uuid[797]: Warning: The kernel is still using the old partition table. Jan 21 00:57:20.895998 disk-uuid[797]: The new table will be used at the next reboot or after you Jan 21 00:57:20.895998 disk-uuid[797]: run partprobe(8) or kpartx(8) Jan 21 00:57:20.895998 disk-uuid[797]: The operation has completed successfully. Jan 21 00:57:20.904700 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 21 00:57:20.913026 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 21 00:57:20.913051 kernel: audit: type=1130 audit(1768957040.904:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:20.913068 kernel: audit: type=1131 audit(1768957040.904:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:20.904000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:20.904000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:20.904801 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 21 00:57:20.907808 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 21 00:57:20.949707 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (919) Jan 21 00:57:20.954088 kernel: BTRFS info (device vda6): first mount of filesystem f0e9d057-8632-47ff-9f6c-54c0e93bf1a9 Jan 21 00:57:20.954165 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 21 00:57:20.963010 kernel: BTRFS info (device vda6): turning on async discard Jan 21 00:57:20.963103 kernel: BTRFS info (device vda6): enabling free space tree Jan 21 00:57:20.971731 kernel: BTRFS info (device vda6): last unmount of filesystem f0e9d057-8632-47ff-9f6c-54c0e93bf1a9 Jan 21 00:57:20.971792 kernel: audit: type=1130 audit(1768957040.970:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:20.970000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:20.970894 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 21 00:57:20.977116 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 21 00:57:21.197948 ignition[938]: Ignition 2.24.0 Jan 21 00:57:21.197959 ignition[938]: Stage: fetch-offline Jan 21 00:57:21.197988 ignition[938]: no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:21.200000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.204739 kernel: audit: type=1130 audit(1768957041.200:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.197996 ignition[938]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:21.200654 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 21 00:57:21.198063 ignition[938]: parsed url from cmdline: "" Jan 21 00:57:21.203805 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 21 00:57:21.198066 ignition[938]: no config URL provided Jan 21 00:57:21.198070 ignition[938]: reading system config file "/usr/lib/ignition/user.ign" Jan 21 00:57:21.198076 ignition[938]: no config at "/usr/lib/ignition/user.ign" Jan 21 00:57:21.198081 ignition[938]: failed to fetch config: resource requires networking Jan 21 00:57:21.198215 ignition[938]: Ignition finished successfully Jan 21 00:57:21.225803 ignition[944]: Ignition 2.24.0 Jan 21 00:57:21.225814 ignition[944]: Stage: fetch Jan 21 00:57:21.225960 ignition[944]: no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:21.225967 ignition[944]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:21.226048 ignition[944]: parsed url from cmdline: "" Jan 21 00:57:21.226051 ignition[944]: no config URL provided Jan 21 00:57:21.226059 ignition[944]: reading system config file "/usr/lib/ignition/user.ign" Jan 21 00:57:21.226065 ignition[944]: no config at "/usr/lib/ignition/user.ign" Jan 21 00:57:21.226171 ignition[944]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 21 00:57:21.226185 ignition[944]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 21 00:57:21.226201 ignition[944]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 21 00:57:21.419871 systemd-networkd[726]: eth0: Gained IPv6LL Jan 21 00:57:21.568566 ignition[944]: GET result: OK Jan 21 00:57:21.569470 ignition[944]: parsing config with SHA512: 2b251d4c74b412aa5ab092e7b9d6f8ee9523e50f31a61d3b3fd01509442f96734f3bf787b017221c094a642ce5d0b9a57f89e5ef72d1c0f18c48c157ae8ba8d7 Jan 21 00:57:21.575456 unknown[944]: fetched base config from "system" Jan 21 00:57:21.575465 unknown[944]: fetched base config from "system" Jan 21 00:57:21.575745 ignition[944]: fetch: fetch complete Jan 21 00:57:21.575470 unknown[944]: fetched user config from "openstack" Jan 21 00:57:21.575750 ignition[944]: fetch: fetch passed Jan 21 00:57:21.575785 ignition[944]: Ignition finished successfully Jan 21 00:57:21.578442 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 21 00:57:21.578000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.583375 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 21 00:57:21.583767 kernel: audit: type=1130 audit(1768957041.578:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.598514 ignition[950]: Ignition 2.24.0 Jan 21 00:57:21.598527 ignition[950]: Stage: kargs Jan 21 00:57:21.598675 ignition[950]: no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:21.598694 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:21.599467 ignition[950]: kargs: kargs passed Jan 21 00:57:21.601066 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 21 00:57:21.599505 ignition[950]: Ignition finished successfully Jan 21 00:57:21.606209 kernel: audit: type=1130 audit(1768957041.601:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.601000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.603371 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 21 00:57:21.631562 ignition[956]: Ignition 2.24.0 Jan 21 00:57:21.631575 ignition[956]: Stage: disks Jan 21 00:57:21.631750 ignition[956]: no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:21.631759 ignition[956]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:21.632561 ignition[956]: disks: disks passed Jan 21 00:57:21.633578 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 21 00:57:21.637719 kernel: audit: type=1130 audit(1768957041.633:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.633000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.632602 ignition[956]: Ignition finished successfully Jan 21 00:57:21.634890 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 21 00:57:21.638075 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 21 00:57:21.638670 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 21 00:57:21.639332 systemd[1]: Reached target sysinit.target - System Initialization. 
Jan 21 00:57:21.640016 systemd[1]: Reached target basic.target - Basic System. Jan 21 00:57:21.641582 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 21 00:57:21.684293 systemd-fsck[964]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 21 00:57:21.686560 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 21 00:57:21.691411 kernel: audit: type=1130 audit(1768957041.686:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:21.688694 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 21 00:57:21.870705 kernel: EXT4-fs (vda9): mounted filesystem cf9e7296-d0ad-4d9a-b030-d4e17a1c88bf r/w with ordered data mode. Quota mode: none. Jan 21 00:57:21.871436 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 21 00:57:21.872460 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 21 00:57:21.876678 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 21 00:57:21.879758 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 21 00:57:21.880759 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 21 00:57:21.884629 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 21 00:57:21.885490 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Jan 21 00:57:21.886231 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 21 00:57:21.887977 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 21 00:57:21.896162 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 21 00:57:21.903706 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (972) Jan 21 00:57:21.907634 kernel: BTRFS info (device vda6): first mount of filesystem f0e9d057-8632-47ff-9f6c-54c0e93bf1a9 Jan 21 00:57:21.907677 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 21 00:57:21.916754 kernel: BTRFS info (device vda6): turning on async discard Jan 21 00:57:21.916809 kernel: BTRFS info (device vda6): enabling free space tree Jan 21 00:57:21.919488 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 21 00:57:21.985741 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:22.125728 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 21 00:57:22.130801 kernel: audit: type=1130 audit(1768957042.125:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:22.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:22.127149 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 21 00:57:22.132421 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 21 00:57:22.146843 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Jan 21 00:57:22.148716 kernel: BTRFS info (device vda6): last unmount of filesystem f0e9d057-8632-47ff-9f6c-54c0e93bf1a9 Jan 21 00:57:22.172742 ignition[1073]: INFO : Ignition 2.24.0 Jan 21 00:57:22.172742 ignition[1073]: INFO : Stage: mount Jan 21 00:57:22.172742 ignition[1073]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:22.172742 ignition[1073]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:22.175673 ignition[1073]: INFO : mount: mount passed Jan 21 00:57:22.175673 ignition[1073]: INFO : Ignition finished successfully Jan 21 00:57:22.174828 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 21 00:57:22.180380 kernel: audit: type=1130 audit(1768957042.176:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:22.176000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:22.178666 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 21 00:57:22.180000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:23.032705 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:25.042709 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:29.047712 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:29.052583 coreos-metadata[974]: Jan 21 00:57:29.052 WARN failed to locate config-drive, using the metadata service API instead Jan 21 00:57:29.066912 coreos-metadata[974]: Jan 21 00:57:29.066 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 21 00:57:29.199789 coreos-metadata[974]: Jan 21 00:57:29.199 INFO Fetch successful Jan 21 00:57:29.199789 coreos-metadata[974]: Jan 21 00:57:29.199 INFO wrote hostname ci-4547-0-0-n-af1f1f5a24 to /sysroot/etc/hostname Jan 21 00:57:29.201994 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 21 00:57:29.210636 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 21 00:57:29.210661 kernel: audit: type=1130 audit(1768957049.201:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:29.210674 kernel: audit: type=1131 audit(1768957049.201:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:29.201000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:29.201000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:29.202107 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 21 00:57:29.203101 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 21 00:57:29.227440 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 21 00:57:29.262725 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1090) Jan 21 00:57:29.266638 kernel: BTRFS info (device vda6): first mount of filesystem f0e9d057-8632-47ff-9f6c-54c0e93bf1a9 Jan 21 00:57:29.266701 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 21 00:57:29.279939 kernel: BTRFS info (device vda6): turning on async discard Jan 21 00:57:29.280032 kernel: BTRFS info (device vda6): enabling free space tree Jan 21 00:57:29.281703 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 21 00:57:29.305582 ignition[1107]: INFO : Ignition 2.24.0 Jan 21 00:57:29.305582 ignition[1107]: INFO : Stage: files Jan 21 00:57:29.306833 ignition[1107]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:29.306833 ignition[1107]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:29.306833 ignition[1107]: DEBUG : files: compiled without relabeling support, skipping Jan 21 00:57:29.308013 ignition[1107]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 21 00:57:29.308013 ignition[1107]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 21 00:57:29.315259 ignition[1107]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 21 00:57:29.316107 ignition[1107]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 21 00:57:29.316107 ignition[1107]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 21 00:57:29.315752 unknown[1107]: wrote ssh authorized keys file for user: core Jan 21 00:57:29.319328 
ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 21 00:57:29.320119 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 21 00:57:29.375080 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 21 00:57:29.490104 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 21 00:57:29.491600 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 21 00:57:29.495371 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 21 00:57:29.495371 
ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 21 00:57:29.495371 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 21 00:57:29.497560 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 21 00:57:29.498514 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 21 00:57:29.498514 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 21 00:57:29.866945 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 21 00:57:31.144383 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 21 00:57:31.144383 ignition[1107]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 21 00:57:31.147926 ignition[1107]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 21 00:57:31.153060 ignition[1107]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 21 00:57:31.153060 ignition[1107]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 21 00:57:31.153060 ignition[1107]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 21 00:57:31.155878 ignition[1107]: INFO : files: op(d): [finished] 
setting preset to enabled for "prepare-helm.service" Jan 21 00:57:31.155878 ignition[1107]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 21 00:57:31.155878 ignition[1107]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 21 00:57:31.155878 ignition[1107]: INFO : files: files passed Jan 21 00:57:31.155878 ignition[1107]: INFO : Ignition finished successfully Jan 21 00:57:31.162891 kernel: audit: type=1130 audit(1768957051.155:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.155753 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 21 00:57:31.157138 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 21 00:57:31.162499 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 21 00:57:31.170264 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 21 00:57:31.170362 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 21 00:57:31.180417 kernel: audit: type=1130 audit(1768957051.170:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.180444 kernel: audit: type=1131 audit(1768957051.171:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:31.170000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.180521 initrd-setup-root-after-ignition[1139]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 21 00:57:31.180521 initrd-setup-root-after-ignition[1139]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 21 00:57:31.181537 initrd-setup-root-after-ignition[1143]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 21 00:57:31.183060 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 21 00:57:31.187517 kernel: audit: type=1130 audit(1768957051.182:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.182000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.183912 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 21 00:57:31.189109 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 21 00:57:31.231710 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 21 00:57:31.231806 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Jan 21 00:57:31.240247 kernel: audit: type=1130 audit(1768957051.232:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.240276 kernel: audit: type=1131 audit(1768957051.232:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.232000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.233206 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 21 00:57:31.240803 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 21 00:57:31.241848 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 21 00:57:31.242786 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 21 00:57:31.263439 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 21 00:57:31.268820 kernel: audit: type=1130 audit(1768957051.263:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.263000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:31.265645 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 21 00:57:31.284452 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 21 00:57:31.284649 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 21 00:57:31.285882 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 21 00:57:31.286862 systemd[1]: Stopped target timers.target - Timer Units. Jan 21 00:57:31.287795 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 21 00:57:31.292638 kernel: audit: type=1131 audit(1768957051.288:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.288000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.287908 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 21 00:57:31.292747 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 21 00:57:31.293857 systemd[1]: Stopped target basic.target - Basic System. Jan 21 00:57:31.294804 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 21 00:57:31.295823 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 21 00:57:31.296694 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 21 00:57:31.297570 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 21 00:57:31.298537 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 21 00:57:31.299512 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 21 00:57:31.300404 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 21 00:57:31.301271 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 21 00:57:31.302191 systemd[1]: Stopped target swap.target - Swaps. Jan 21 00:57:31.303043 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 21 00:57:31.303000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.303160 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 21 00:57:31.304391 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 21 00:57:31.304873 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 21 00:57:31.305644 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 21 00:57:31.306000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.305727 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 21 00:57:31.306508 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 21 00:57:31.308000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.306605 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 21 00:57:31.308000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.307779 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. 
Jan 21 00:57:31.307894 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 21 00:57:31.308748 systemd[1]: ignition-files.service: Deactivated successfully. Jan 21 00:57:31.308852 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 21 00:57:31.311874 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 21 00:57:31.312305 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 21 00:57:31.312000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.312480 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 21 00:57:31.315784 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 21 00:57:31.316231 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 21 00:57:31.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.316383 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 21 00:57:31.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.317088 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 21 00:57:31.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.317190 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Jan 21 00:57:31.317781 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 21 00:57:31.318099 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 21 00:57:31.321516 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 21 00:57:31.324230 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 21 00:57:31.324000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.338269 ignition[1164]: INFO : Ignition 2.24.0 Jan 21 00:57:31.340326 ignition[1164]: INFO : Stage: umount Jan 21 00:57:31.340326 ignition[1164]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 21 00:57:31.340326 ignition[1164]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 21 00:57:31.340326 ignition[1164]: INFO : umount: umount passed Jan 21 00:57:31.340326 ignition[1164]: INFO : Ignition finished successfully Jan 21 00:57:31.343622 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 21 00:57:31.343755 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 21 00:57:31.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.344000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.344876 systemd[1]: ignition-disks.service: Deactivated successfully. 
Jan 21 00:57:31.344920 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 21 00:57:31.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.345766 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 21 00:57:31.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.345808 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 21 00:57:31.348000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.346385 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 21 00:57:31.346425 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 21 00:57:31.348169 systemd[1]: Stopped target network.target - Network. Jan 21 00:57:31.348731 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 21 00:57:31.348774 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 21 00:57:31.349761 systemd[1]: Stopped target paths.target - Path Units. Jan 21 00:57:31.350255 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 21 00:57:31.351151 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 21 00:57:31.351784 systemd[1]: Stopped target slices.target - Slice Units. Jan 21 00:57:31.352124 systemd[1]: Stopped target sockets.target - Socket Units. Jan 21 00:57:31.352505 systemd[1]: iscsid.socket: Deactivated successfully. Jan 21 00:57:31.352543 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. 
Jan 21 00:57:31.352909 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 21 00:57:31.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.352936 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 21 00:57:31.353265 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 21 00:57:31.353285 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 21 00:57:31.353599 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 21 00:57:31.353646 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 21 00:57:31.356801 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 21 00:57:31.356841 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 21 00:57:31.357355 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 21 00:57:31.357699 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 21 00:57:31.360318 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 21 00:57:31.364150 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 21 00:57:31.364245 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 21 00:57:31.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.367000 audit: BPF prog-id=6 op=UNLOAD Jan 21 00:57:31.370441 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Jan 21 00:57:31.371081 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 21 00:57:31.371000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.373309 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 21 00:57:31.374197 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 21 00:57:31.374000 audit: BPF prog-id=9 op=UNLOAD Jan 21 00:57:31.375162 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 21 00:57:31.377216 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 21 00:57:31.377928 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 21 00:57:31.378285 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 21 00:57:31.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.379132 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 21 00:57:31.379507 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 21 00:57:31.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.380285 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 21 00:57:31.380723 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 21 00:57:31.380000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:31.381131 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 21 00:57:31.382482 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 21 00:57:31.383000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.384017 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 21 00:57:31.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.384858 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 21 00:57:31.384941 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 21 00:57:31.387227 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 21 00:57:31.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.387342 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 21 00:57:31.387902 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 21 00:57:31.387934 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 21 00:57:31.390130 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 21 00:57:31.390000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.390158 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 21 00:57:31.390506 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
Jan 21 00:57:31.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.390547 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 21 00:57:31.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.391299 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 21 00:57:31.391331 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 21 00:57:31.392571 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 21 00:57:31.392611 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 21 00:57:31.394406 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 21 00:57:31.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.396126 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 21 00:57:31.396168 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 21 00:57:31.396557 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 21 00:57:31.396588 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 21 00:57:31.396966 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 21 00:57:31.396997 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:31.412534 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 21 00:57:31.412000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.412000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.412793 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 21 00:57:31.413000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:31.413737 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 21 00:57:31.413820 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 21 00:57:31.414880 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 21 00:57:31.416852 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 21 00:57:31.430734 systemd[1]: Switching root. Jan 21 00:57:31.477842 systemd-journald[343]: Journal stopped Jan 21 00:57:32.686018 systemd-journald[343]: Received SIGTERM from PID 1 (systemd). 
Jan 21 00:57:32.686099 kernel: SELinux: policy capability network_peer_controls=1 Jan 21 00:57:32.686114 kernel: SELinux: policy capability open_perms=1 Jan 21 00:57:32.686129 kernel: SELinux: policy capability extended_socket_class=1 Jan 21 00:57:32.686145 kernel: SELinux: policy capability always_check_network=0 Jan 21 00:57:32.686156 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 21 00:57:32.686170 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 21 00:57:32.686187 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 21 00:57:32.686198 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 21 00:57:32.686212 kernel: SELinux: policy capability userspace_initial_context=0 Jan 21 00:57:32.686224 systemd[1]: Successfully loaded SELinux policy in 72.360ms. Jan 21 00:57:32.686243 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.670ms. Jan 21 00:57:32.686258 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 21 00:57:32.686270 systemd[1]: Detected virtualization kvm. Jan 21 00:57:32.686286 systemd[1]: Detected architecture x86-64. Jan 21 00:57:32.686297 systemd[1]: Detected first boot. Jan 21 00:57:32.686309 systemd[1]: Hostname set to . Jan 21 00:57:32.686321 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 21 00:57:32.686336 zram_generator::config[1207]: No configuration found. 
Jan 21 00:57:32.686352 kernel: Guest personality initialized and is inactive Jan 21 00:57:32.686365 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 21 00:57:32.686378 kernel: Initialized host personality Jan 21 00:57:32.686389 kernel: NET: Registered PF_VSOCK protocol family Jan 21 00:57:32.686401 systemd[1]: Populated /etc with preset unit settings. Jan 21 00:57:32.686412 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 21 00:57:32.686424 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 21 00:57:32.686435 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 21 00:57:32.686452 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 21 00:57:32.686464 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 21 00:57:32.686475 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 21 00:57:32.686486 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 21 00:57:32.686498 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 21 00:57:32.686510 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 21 00:57:32.686521 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 21 00:57:32.686534 systemd[1]: Created slice user.slice - User and Session Slice. Jan 21 00:57:32.686546 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 21 00:57:32.686558 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 21 00:57:32.686569 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 21 00:57:32.686580 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 21 00:57:32.686592 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 21 00:57:32.686605 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 21 00:57:32.686617 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 21 00:57:32.686632 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 21 00:57:32.686648 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 21 00:57:32.686659 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 21 00:57:32.686670 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 21 00:57:32.686695 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 21 00:57:32.686707 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 21 00:57:32.686719 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 21 00:57:32.686730 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 21 00:57:32.686743 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 21 00:57:32.686754 systemd[1]: Reached target slices.target - Slice Units. Jan 21 00:57:32.686765 systemd[1]: Reached target swap.target - Swaps. Jan 21 00:57:32.686778 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 21 00:57:32.686790 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 21 00:57:32.686801 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 21 00:57:32.686812 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 21 00:57:32.686823 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. 
Jan 21 00:57:32.686834 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 21 00:57:32.686847 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 21 00:57:32.686860 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 21 00:57:32.686872 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 21 00:57:32.686883 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 21 00:57:32.686894 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 21 00:57:32.686905 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 21 00:57:32.686916 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 21 00:57:32.686928 systemd[1]: Mounting media.mount - External Media Directory... Jan 21 00:57:32.686942 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 21 00:57:32.686953 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 21 00:57:32.686964 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 21 00:57:32.686975 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 21 00:57:32.686986 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 21 00:57:32.686997 systemd[1]: Reached target machines.target - Containers. Jan 21 00:57:32.687009 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 21 00:57:32.687022 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 21 00:57:32.687034 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 21 00:57:32.687045 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 21 00:57:32.687056 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 21 00:57:32.687068 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 21 00:57:32.687079 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 21 00:57:32.687092 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 21 00:57:32.687104 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 21 00:57:32.687119 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 21 00:57:32.687130 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 21 00:57:32.687143 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 21 00:57:32.687154 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 21 00:57:32.687166 systemd[1]: Stopped systemd-fsck-usr.service. Jan 21 00:57:32.687178 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 21 00:57:32.687189 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 21 00:57:32.687199 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 21 00:57:32.687211 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 21 00:57:32.687223 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 21 00:57:32.687234 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 21 00:57:32.687245 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 21 00:57:32.687256 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 21 00:57:32.687267 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 21 00:57:32.687280 kernel: fuse: init (API version 7.41) Jan 21 00:57:32.687290 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 21 00:57:32.687301 systemd[1]: Mounted media.mount - External Media Directory. Jan 21 00:57:32.687313 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 21 00:57:32.687323 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 21 00:57:32.687334 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 21 00:57:32.687347 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 21 00:57:32.687358 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 21 00:57:32.687368 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 21 00:57:32.687379 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 21 00:57:32.687390 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 21 00:57:32.687401 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 21 00:57:32.687412 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 21 00:57:32.687425 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 21 00:57:32.687436 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 21 00:57:32.687447 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 21 00:57:32.687458 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Jan 21 00:57:32.687469 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 21 00:57:32.687480 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 21 00:57:32.687491 kernel: ACPI: bus type drm_connector registered Jan 21 00:57:32.687503 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 21 00:57:32.687515 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 21 00:57:32.687526 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 21 00:57:32.687539 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 21 00:57:32.687550 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 21 00:57:32.687563 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 21 00:57:32.687574 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 21 00:57:32.687585 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 21 00:57:32.687598 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 21 00:57:32.687609 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 21 00:57:32.687620 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 21 00:57:32.687656 systemd-journald[1282]: Collecting audit messages is enabled. Jan 21 00:57:32.688534 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 21 00:57:32.688565 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jan 21 00:57:32.688584 systemd-journald[1282]: Journal started Jan 21 00:57:32.688611 systemd-journald[1282]: Runtime Journal (/run/log/journal/946d775c848e45838c905fb1affa53e4) is 8M, max 77.9M, 69.9M free. Jan 21 00:57:32.522000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.528000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.532000 audit: BPF prog-id=14 op=UNLOAD Jan 21 00:57:32.532000 audit: BPF prog-id=13 op=UNLOAD Jan 21 00:57:32.535000 audit: BPF prog-id=15 op=LOAD Jan 21 00:57:32.535000 audit: BPF prog-id=16 op=LOAD Jan 21 00:57:32.535000 audit: BPF prog-id=17 op=LOAD Jan 21 00:57:32.607000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.615000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.615000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.619000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:32.619000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.628000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.636000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:32.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.644000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.653000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.653000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.658000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.678000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 21 00:57:32.678000 audit[1282]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffc867147b0 a2=4000 a3=0 items=0 ppid=1 pid=1282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:57:32.678000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 21 00:57:32.328030 systemd[1]: Queued start job for default target multi-user.target. Jan 21 00:57:32.351710 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. 
Jan 21 00:57:32.352129 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 21 00:57:32.696305 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 21 00:57:32.696393 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 21 00:57:32.699394 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 21 00:57:32.705699 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 21 00:57:32.708692 systemd[1]: Started systemd-journald.service - Journal Service. Jan 21 00:57:32.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.712970 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 21 00:57:32.729205 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 21 00:57:32.731801 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 21 00:57:32.733737 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 21 00:57:32.734000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.735396 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 21 00:57:32.743952 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Jan 21 00:57:32.746000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.746200 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 21 00:57:32.759702 kernel: loop1: detected capacity change from 0 to 1656 Jan 21 00:57:32.779819 systemd-journald[1282]: Time spent on flushing to /var/log/journal/946d775c848e45838c905fb1affa53e4 is 40.720ms for 1846 entries. Jan 21 00:57:32.779819 systemd-journald[1282]: System Journal (/var/log/journal/946d775c848e45838c905fb1affa53e4) is 8M, max 588.1M, 580.1M free. Jan 21 00:57:32.835267 systemd-journald[1282]: Received client request to flush runtime journal. Jan 21 00:57:32.835301 kernel: loop2: detected capacity change from 0 to 111560 Jan 21 00:57:32.788000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.788785 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 21 00:57:32.811942 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 21 00:57:32.837006 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
Jan 21 00:57:32.840822 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 21 00:57:32.840000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.842000 audit: BPF prog-id=18 op=LOAD Jan 21 00:57:32.842000 audit: BPF prog-id=19 op=LOAD Jan 21 00:57:32.842000 audit: BPF prog-id=20 op=LOAD Jan 21 00:57:32.848000 audit: BPF prog-id=21 op=LOAD Jan 21 00:57:32.847825 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 21 00:57:32.849921 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 21 00:57:32.852821 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 21 00:57:32.858009 kernel: loop3: detected capacity change from 0 to 50784 Jan 21 00:57:32.858000 audit: BPF prog-id=22 op=LOAD Jan 21 00:57:32.859000 audit: BPF prog-id=23 op=LOAD Jan 21 00:57:32.859000 audit: BPF prog-id=24 op=LOAD Jan 21 00:57:32.860361 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 21 00:57:32.862000 audit: BPF prog-id=25 op=LOAD Jan 21 00:57:32.862000 audit: BPF prog-id=26 op=LOAD Jan 21 00:57:32.863000 audit: BPF prog-id=27 op=LOAD Jan 21 00:57:32.864836 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 21 00:57:32.902943 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Jan 21 00:57:32.902957 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Jan 21 00:57:32.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.913828 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Jan 21 00:57:32.916698 kernel: loop4: detected capacity change from 0 to 224512 Jan 21 00:57:32.930672 systemd-nsresourced[1353]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 21 00:57:32.935705 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 21 00:57:32.936000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.937840 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 21 00:57:32.938000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:32.971705 kernel: loop5: detected capacity change from 0 to 1656 Jan 21 00:57:32.989826 kernel: loop6: detected capacity change from 0 to 111560 Jan 21 00:57:32.998832 systemd-oomd[1349]: No swap; memory pressure usage will be degraded Jan 21 00:57:32.999468 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 21 00:57:32.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.015699 kernel: loop7: detected capacity change from 0 to 50784 Jan 21 00:57:33.029446 systemd-resolved[1350]: Positive Trust Anchors: Jan 21 00:57:33.029462 systemd-resolved[1350]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 21 00:57:33.029466 systemd-resolved[1350]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 21 00:57:33.029498 systemd-resolved[1350]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 21 00:57:33.050090 systemd-resolved[1350]: Using system hostname 'ci-4547-0-0-n-af1f1f5a24'. Jan 21 00:57:33.054116 kernel: loop1: detected capacity change from 0 to 224512 Jan 21 00:57:33.051000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.051921 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 21 00:57:33.052620 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 21 00:57:33.084270 (sd-merge)[1372]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 21 00:57:33.087845 (sd-merge)[1372]: Merged extensions into '/usr'. Jan 21 00:57:33.094208 systemd[1]: Reload requested from client PID 1312 ('systemd-sysext') (unit systemd-sysext.service)... Jan 21 00:57:33.094310 systemd[1]: Reloading... Jan 21 00:57:33.167708 zram_generator::config[1403]: No configuration found. Jan 21 00:57:33.345703 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 21 00:57:33.345796 systemd[1]: Reloading finished in 251 ms. Jan 21 00:57:33.382637 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
Jan 21 00:57:33.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.383476 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 21 00:57:33.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.387828 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 21 00:57:33.389123 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 21 00:57:33.396849 systemd[1]: Starting ensure-sysext.service... Jan 21 00:57:33.401751 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 21 00:57:33.401000 audit: BPF prog-id=8 op=UNLOAD Jan 21 00:57:33.401000 audit: BPF prog-id=7 op=UNLOAD Jan 21 00:57:33.404000 audit: BPF prog-id=28 op=LOAD Jan 21 00:57:33.404000 audit: BPF prog-id=29 op=LOAD Jan 21 00:57:33.407816 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 21 00:57:33.409000 audit: BPF prog-id=30 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=22 op=UNLOAD Jan 21 00:57:33.409000 audit: BPF prog-id=31 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=32 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=23 op=UNLOAD Jan 21 00:57:33.409000 audit: BPF prog-id=24 op=UNLOAD Jan 21 00:57:33.409000 audit: BPF prog-id=33 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=25 op=UNLOAD Jan 21 00:57:33.409000 audit: BPF prog-id=34 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=35 op=LOAD Jan 21 00:57:33.409000 audit: BPF prog-id=26 op=UNLOAD Jan 21 00:57:33.409000 audit: BPF prog-id=27 op=UNLOAD Jan 21 00:57:33.410000 audit: BPF prog-id=36 op=LOAD Jan 21 00:57:33.410000 audit: BPF prog-id=18 op=UNLOAD Jan 21 00:57:33.410000 audit: BPF prog-id=37 op=LOAD Jan 21 00:57:33.410000 audit: BPF prog-id=38 op=LOAD Jan 21 00:57:33.410000 audit: BPF prog-id=19 op=UNLOAD Jan 21 00:57:33.410000 audit: BPF prog-id=20 op=UNLOAD Jan 21 00:57:33.411000 audit: BPF prog-id=39 op=LOAD Jan 21 00:57:33.411000 audit: BPF prog-id=21 op=UNLOAD Jan 21 00:57:33.412000 audit: BPF prog-id=40 op=LOAD Jan 21 00:57:33.412000 audit: BPF prog-id=15 op=UNLOAD Jan 21 00:57:33.412000 audit: BPF prog-id=41 op=LOAD Jan 21 00:57:33.412000 audit: BPF prog-id=42 op=LOAD Jan 21 00:57:33.412000 audit: BPF prog-id=16 op=UNLOAD Jan 21 00:57:33.412000 audit: BPF prog-id=17 op=UNLOAD Jan 21 00:57:33.416022 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 21 00:57:33.416648 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 21 00:57:33.422822 systemd[1]: Reload requested from client PID 1448 ('systemctl') (unit ensure-sysext.service)... Jan 21 00:57:33.422836 systemd[1]: Reloading... Jan 21 00:57:33.432230 systemd-tmpfiles[1449]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. 
Jan 21 00:57:33.432259 systemd-tmpfiles[1449]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 21 00:57:33.432552 systemd-tmpfiles[1449]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 21 00:57:33.434321 systemd-tmpfiles[1449]: ACLs are not supported, ignoring. Jan 21 00:57:33.434378 systemd-tmpfiles[1449]: ACLs are not supported, ignoring. Jan 21 00:57:33.445103 systemd-tmpfiles[1449]: Detected autofs mount point /boot during canonicalization of boot. Jan 21 00:57:33.446703 systemd-tmpfiles[1449]: Skipping /boot Jan 21 00:57:33.448848 systemd-udevd[1450]: Using default interface naming scheme 'v257'. Jan 21 00:57:33.458964 systemd-tmpfiles[1449]: Detected autofs mount point /boot during canonicalization of boot. Jan 21 00:57:33.458975 systemd-tmpfiles[1449]: Skipping /boot Jan 21 00:57:33.494704 zram_generator::config[1482]: No configuration found. Jan 21 00:57:33.624740 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 21 00:57:33.644702 kernel: mousedev: PS/2 mouse device common for all mice Jan 21 00:57:33.646702 kernel: ACPI: button: Power Button [PWRF] Jan 21 00:57:33.793812 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 21 00:57:33.794093 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 21 00:57:33.794233 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 21 00:57:33.804835 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 21 00:57:33.804918 systemd[1]: Reloading finished in 381 ms. Jan 21 00:57:33.819961 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 21 00:57:33.820000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:33.821000 audit: BPF prog-id=43 op=LOAD Jan 21 00:57:33.821000 audit: BPF prog-id=39 op=UNLOAD Jan 21 00:57:33.821000 audit: BPF prog-id=44 op=LOAD Jan 21 00:57:33.821000 audit: BPF prog-id=33 op=UNLOAD Jan 21 00:57:33.822000 audit: BPF prog-id=45 op=LOAD Jan 21 00:57:33.822000 audit: BPF prog-id=46 op=LOAD Jan 21 00:57:33.822000 audit: BPF prog-id=34 op=UNLOAD Jan 21 00:57:33.822000 audit: BPF prog-id=35 op=UNLOAD Jan 21 00:57:33.822000 audit: BPF prog-id=47 op=LOAD Jan 21 00:57:33.822000 audit: BPF prog-id=48 op=LOAD Jan 21 00:57:33.822000 audit: BPF prog-id=28 op=UNLOAD Jan 21 00:57:33.822000 audit: BPF prog-id=29 op=UNLOAD Jan 21 00:57:33.824000 audit: BPF prog-id=49 op=LOAD Jan 21 00:57:33.824000 audit: BPF prog-id=36 op=UNLOAD Jan 21 00:57:33.824000 audit: BPF prog-id=50 op=LOAD Jan 21 00:57:33.824000 audit: BPF prog-id=51 op=LOAD Jan 21 00:57:33.824000 audit: BPF prog-id=37 op=UNLOAD Jan 21 00:57:33.825000 audit: BPF prog-id=38 op=UNLOAD Jan 21 00:57:33.825000 audit: BPF prog-id=52 op=LOAD Jan 21 00:57:33.825000 audit: BPF prog-id=40 op=UNLOAD Jan 21 00:57:33.826000 audit: BPF prog-id=53 op=LOAD Jan 21 00:57:33.826000 audit: BPF prog-id=54 op=LOAD Jan 21 00:57:33.833000 audit: BPF prog-id=41 op=UNLOAD Jan 21 00:57:33.833000 audit: BPF prog-id=42 op=UNLOAD Jan 21 00:57:33.834000 audit: BPF prog-id=55 op=LOAD Jan 21 00:57:33.834000 audit: BPF prog-id=30 op=UNLOAD Jan 21 00:57:33.834000 audit: BPF prog-id=56 op=LOAD Jan 21 00:57:33.834000 audit: BPF prog-id=57 op=LOAD Jan 21 00:57:33.834000 audit: BPF prog-id=31 op=UNLOAD Jan 21 00:57:33.834000 audit: BPF prog-id=32 op=UNLOAD Jan 21 00:57:33.843700 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 21 00:57:33.849703 kernel: Console: switching to colour dummy device 80x25 Jan 21 00:57:33.855937 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 21 00:57:33.875000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.878989 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 21 00:57:33.879203 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 21 00:57:33.879217 kernel: [drm] features: -context_init Jan 21 00:57:33.886712 kernel: [drm] number of scanouts: 1 Jan 21 00:57:33.910706 kernel: [drm] number of cap sets: 0 Jan 21 00:57:33.914702 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 21 00:57:33.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:33.921238 systemd[1]: Finished ensure-sysext.service. Jan 21 00:57:33.925703 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 21 00:57:33.925765 kernel: Console: switching to colour frame buffer device 160x50 Jan 21 00:57:33.931733 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 21 00:57:33.943502 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 21 00:57:33.960547 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 21 00:57:33.962291 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 21 00:57:33.964850 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 21 00:57:33.965286 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 21 00:57:33.966882 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Jan 21 00:57:33.969818 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 21 00:57:33.977377 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 21 00:57:33.981966 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 21 00:57:33.984013 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 21 00:57:33.984578 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 21 00:57:33.984675 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 21 00:57:33.986937 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 21 00:57:33.989805 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 21 00:57:33.989893 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 21 00:57:33.992864 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 21 00:57:33.999000 audit: BPF prog-id=58 op=LOAD Jan 21 00:57:34.001481 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 21 00:57:34.003833 systemd[1]: Reached target time-set.target - System Time Set. Jan 21 00:57:34.008642 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 21 00:57:34.016660 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 21 00:57:34.017710 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jan 21 00:57:34.018427 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 21 00:57:34.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.019000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.019799 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 21 00:57:34.021482 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 21 00:57:34.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.033000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.032114 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 21 00:57:34.032364 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 21 00:57:34.075000 audit[1588]: SYSTEM_BOOT pid=1588 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.080610 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 21 00:57:34.082122 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 21 00:57:34.082163 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 21 00:57:34.083422 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 21 00:57:34.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.083000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.088961 kernel: PTP clock support registered Jan 21 00:57:34.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.087081 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 21 00:57:34.094000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.092126 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 21 00:57:34.092972 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 21 00:57:34.096526 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 21 00:57:34.097512 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Jan 21 00:57:34.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.104000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:57:34.104458 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 21 00:57:34.114124 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 21 00:57:34.127130 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 21 00:57:34.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:57:34.150000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 21 00:57:34.150000 audit[1618]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe8f5c30b0 a2=420 a3=0 items=0 ppid=1573 pid=1618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:57:34.150000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 21 00:57:34.150953 augenrules[1618]: No rules Jan 21 00:57:34.152059 systemd[1]: audit-rules.service: Deactivated successfully. Jan 21 00:57:34.152790 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 21 00:57:34.162596 systemd-networkd[1587]: lo: Link UP Jan 21 00:57:34.162604 systemd-networkd[1587]: lo: Gained carrier Jan 21 00:57:34.164197 systemd-networkd[1587]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 21 00:57:34.164204 systemd-networkd[1587]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 21 00:57:34.164471 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 21 00:57:34.165712 systemd-networkd[1587]: eth0: Link UP Jan 21 00:57:34.165849 systemd-networkd[1587]: eth0: Gained carrier Jan 21 00:57:34.165863 systemd-networkd[1587]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 21 00:57:34.165966 systemd[1]: Reached target network.target - Network. Jan 21 00:57:34.169519 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 21 00:57:34.172804 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... 
Jan 21 00:57:34.177769 systemd-networkd[1587]: eth0: DHCPv4 address 10.0.5.74/25, gateway 10.0.5.1 acquired from 10.0.5.1 Jan 21 00:57:34.193551 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 21 00:57:34.204280 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 21 00:57:34.232673 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 21 00:57:34.233753 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 21 00:57:34.791719 ldconfig[1581]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 21 00:57:34.797441 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 21 00:57:34.800553 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 21 00:57:34.818435 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 21 00:57:34.820378 systemd[1]: Reached target sysinit.target - System Initialization. Jan 21 00:57:34.821918 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 21 00:57:34.822345 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 21 00:57:34.822949 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 21 00:57:34.824799 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 21 00:57:34.825794 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 21 00:57:34.826211 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. 
Jan 21 00:57:34.826652 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 21 00:57:34.827208 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 21 00:57:34.832453 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 21 00:57:34.832497 systemd[1]: Reached target paths.target - Path Units. Jan 21 00:57:34.832930 systemd[1]: Reached target timers.target - Timer Units. Jan 21 00:57:34.841216 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 21 00:57:34.843579 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 21 00:57:34.847810 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 21 00:57:34.851670 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 21 00:57:34.852090 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 21 00:57:34.863654 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 21 00:57:34.866479 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 21 00:57:34.867742 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 21 00:57:34.871439 systemd[1]: Reached target sockets.target - Socket Units. Jan 21 00:57:34.873181 systemd[1]: Reached target basic.target - Basic System. Jan 21 00:57:34.873606 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 21 00:57:34.873634 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 21 00:57:34.876699 systemd[1]: Starting chronyd.service - NTP client/server... Jan 21 00:57:34.881799 systemd[1]: Starting containerd.service - containerd container runtime... 
Jan 21 00:57:34.889586 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 21 00:57:34.894859 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 21 00:57:34.900882 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 21 00:57:34.902606 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 21 00:57:34.910933 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 21 00:57:34.911487 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 21 00:57:34.918725 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:34.919376 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 21 00:57:34.925073 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 21 00:57:34.933925 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 21 00:57:34.939925 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 21 00:57:34.946748 jq[1641]: false Jan 21 00:57:34.946958 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 21 00:57:34.947216 chronyd[1636]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 21 00:57:34.948747 chronyd[1636]: Loaded seccomp filter (level 2) Jan 21 00:57:34.953672 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 21 00:57:34.955724 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 21 00:57:34.957144 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
Jan 21 00:57:34.959854 extend-filesystems[1642]: Found /dev/vda6 Jan 21 00:57:34.962716 systemd[1]: Starting update-engine.service - Update Engine... Jan 21 00:57:34.968818 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 21 00:57:34.970096 systemd[1]: Started chronyd.service - NTP client/server. Jan 21 00:57:34.978342 extend-filesystems[1642]: Found /dev/vda9 Jan 21 00:57:34.980505 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 21 00:57:34.982090 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 21 00:57:34.982318 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 21 00:57:34.983406 extend-filesystems[1642]: Checking size of /dev/vda9 Jan 21 00:57:34.994351 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Refreshing passwd entry cache Jan 21 00:57:34.991191 oslogin_cache_refresh[1643]: Refreshing passwd entry cache Jan 21 00:57:35.001705 jq[1656]: true Jan 21 00:57:35.004517 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Failure getting users, quitting Jan 21 00:57:35.004511 oslogin_cache_refresh[1643]: Failure getting users, quitting Jan 21 00:57:35.004609 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 21 00:57:35.004609 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Refreshing group entry cache Jan 21 00:57:35.004531 oslogin_cache_refresh[1643]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 21 00:57:35.004591 oslogin_cache_refresh[1643]: Refreshing group entry cache Jan 21 00:57:35.011067 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 21 00:57:35.011995 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Jan 21 00:57:35.013975 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Failure getting groups, quitting Jan 21 00:57:35.013975 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 21 00:57:35.013968 oslogin_cache_refresh[1643]: Failure getting groups, quitting Jan 21 00:57:35.013982 oslogin_cache_refresh[1643]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 21 00:57:35.018522 extend-filesystems[1642]: Resized partition /dev/vda9 Jan 21 00:57:35.019132 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 21 00:57:35.021669 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 21 00:57:35.025455 extend-filesystems[1681]: resize2fs 1.47.3 (8-Jul-2025) Jan 21 00:57:35.052707 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 21 00:57:35.053382 systemd[1]: motdgen.service: Deactivated successfully. Jan 21 00:57:35.053625 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 21 00:57:35.056779 update_engine[1654]: I20260121 00:57:35.056032 1654 main.cc:92] Flatcar Update Engine starting Jan 21 00:57:35.069479 tar[1660]: linux-amd64/LICENSE Jan 21 00:57:35.071231 tar[1660]: linux-amd64/helm Jan 21 00:57:35.075636 jq[1671]: true Jan 21 00:57:35.116296 dbus-daemon[1639]: [system] SELinux support is enabled Jan 21 00:57:35.128972 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 21 00:57:35.138373 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 21 00:57:35.139103 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jan 21 00:57:35.140738 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 21 00:57:35.140762 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 21 00:57:35.147829 systemd[1]: Started update-engine.service - Update Engine. Jan 21 00:57:35.149867 update_engine[1654]: I20260121 00:57:35.147672 1654 update_check_scheduler.cc:74] Next update check in 4m22s Jan 21 00:57:35.151832 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 21 00:57:35.157582 systemd-logind[1653]: New seat seat0. Jan 21 00:57:35.160616 systemd-logind[1653]: Watching system buttons on /dev/input/event3 (Power Button) Jan 21 00:57:35.160632 systemd-logind[1653]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 21 00:57:35.167758 systemd[1]: Started systemd-logind.service - User Login Management. Jan 21 00:57:35.301488 locksmithd[1710]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 21 00:57:35.325160 sshd_keygen[1682]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 21 00:57:35.348039 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 21 00:57:35.352035 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 21 00:57:35.368004 systemd[1]: issuegen.service: Deactivated successfully. Jan 21 00:57:35.368424 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 21 00:57:35.374455 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 21 00:57:35.391042 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 21 00:57:35.396015 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 21 00:57:35.398968 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. 
Jan 21 00:57:35.402557 systemd[1]: Reached target getty.target - Login Prompts. Jan 21 00:57:35.411218 containerd[1680]: time="2026-01-21T00:57:35Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 21 00:57:35.427893 containerd[1680]: time="2026-01-21T00:57:35.427815479Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 21 00:57:35.429609 bash[1709]: Updated "/home/core/.ssh/authorized_keys" Jan 21 00:57:35.433345 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 21 00:57:35.441956 systemd[1]: Starting sshkeys.service... Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.442806853Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.333µs" Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.442891896Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.442936036Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.442951850Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443123506Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443137328Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443190187Z" level=info 
msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443201269Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443430133Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443442102Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443452982Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 21 00:57:35.443789 containerd[1680]: time="2026-01-21T00:57:35.443461631Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443606172Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443620623Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443708573Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443890829Z" level=info 
msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443918285Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443927322Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 21 00:57:35.444018 containerd[1680]: time="2026-01-21T00:57:35.443960909Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 21 00:57:35.444699 containerd[1680]: time="2026-01-21T00:57:35.444245449Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 21 00:57:35.444699 containerd[1680]: time="2026-01-21T00:57:35.444319049Z" level=info msg="metadata content store policy set" policy=shared Jan 21 00:57:35.463954 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 21 00:57:35.467571 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 21 00:57:35.492758 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:35.497854 containerd[1680]: time="2026-01-21T00:57:35.497779658Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 21 00:57:35.497854 containerd[1680]: time="2026-01-21T00:57:35.497849582Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 21 00:57:35.498011 containerd[1680]: time="2026-01-21T00:57:35.497959015Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 21 00:57:35.498011 containerd[1680]: time="2026-01-21T00:57:35.497976238Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 21 00:57:35.498011 containerd[1680]: time="2026-01-21T00:57:35.497988729Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 21 00:57:35.498011 containerd[1680]: time="2026-01-21T00:57:35.497999287Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498017669Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498028161Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498041169Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498052217Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 21 
00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498062950Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498072751Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498081922Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 21 00:57:35.498108 containerd[1680]: time="2026-01-21T00:57:35.498093045Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 21 00:57:35.498254 containerd[1680]: time="2026-01-21T00:57:35.498207045Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 21 00:57:35.498254 containerd[1680]: time="2026-01-21T00:57:35.498224709Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 21 00:57:35.498254 containerd[1680]: time="2026-01-21T00:57:35.498247308Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: time="2026-01-21T00:57:35.498257993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: time="2026-01-21T00:57:35.498268838Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: time="2026-01-21T00:57:35.498277405Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: time="2026-01-21T00:57:35.498287204Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: 
time="2026-01-21T00:57:35.498296460Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 21 00:57:35.498314 containerd[1680]: time="2026-01-21T00:57:35.498308905Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498318296Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498327451Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498347001Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498386643Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498398115Z" level=info msg="Start snapshots syncer" Jan 21 00:57:35.498417 containerd[1680]: time="2026-01-21T00:57:35.498414678Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 21 00:57:35.499813 containerd[1680]: time="2026-01-21T00:57:35.499698196Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 21 00:57:35.499813 containerd[1680]: time="2026-01-21T00:57:35.499765921Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 21 00:57:35.500051 containerd[1680]: 
time="2026-01-21T00:57:35.499834579Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 21 00:57:35.500051 containerd[1680]: time="2026-01-21T00:57:35.499930268Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 21 00:57:35.500051 containerd[1680]: time="2026-01-21T00:57:35.499949147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 21 00:57:35.500762 containerd[1680]: time="2026-01-21T00:57:35.500715044Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 21 00:57:35.500762 containerd[1680]: time="2026-01-21T00:57:35.500752997Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 21 00:57:35.500762 containerd[1680]: time="2026-01-21T00:57:35.500766326Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500777875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500788133Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500797458Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500816293Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500854887Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 21 00:57:35.500887 containerd[1680]: 
time="2026-01-21T00:57:35.500868230Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500875800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500884377Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 21 00:57:35.500887 containerd[1680]: time="2026-01-21T00:57:35.500892432Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500902144Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500911173Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500921634Z" level=info msg="runtime interface created" Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500927417Z" level=info msg="created NRI interface" Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500940395Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500954469Z" level=info msg="Connect containerd service" Jan 21 00:57:35.501049 containerd[1680]: time="2026-01-21T00:57:35.500974539Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 21 00:57:35.503402 containerd[1680]: time="2026-01-21T00:57:35.503370747Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 21 00:57:35.603466 containerd[1680]: time="2026-01-21T00:57:35.603387567Z" level=info msg="Start subscribing containerd event" Jan 21 00:57:35.603719 containerd[1680]: time="2026-01-21T00:57:35.603581382Z" level=info msg="Start recovering state" Jan 21 00:57:35.603802 containerd[1680]: time="2026-01-21T00:57:35.603792401Z" level=info msg="Start event monitor" Jan 21 00:57:35.603837 containerd[1680]: time="2026-01-21T00:57:35.603831008Z" level=info msg="Start cni network conf syncer for default" Jan 21 00:57:35.604002 containerd[1680]: time="2026-01-21T00:57:35.603860461Z" level=info msg="Start streaming server" Jan 21 00:57:35.604002 containerd[1680]: time="2026-01-21T00:57:35.603870450Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 21 00:57:35.604002 containerd[1680]: time="2026-01-21T00:57:35.603877020Z" level=info msg="runtime interface starting up..." Jan 21 00:57:35.604002 containerd[1680]: time="2026-01-21T00:57:35.603882141Z" level=info msg="starting plugins..." Jan 21 00:57:35.604002 containerd[1680]: time="2026-01-21T00:57:35.603892405Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 21 00:57:35.604552 containerd[1680]: time="2026-01-21T00:57:35.604525223Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 21 00:57:35.605614 containerd[1680]: time="2026-01-21T00:57:35.605598659Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 21 00:57:35.606566 containerd[1680]: time="2026-01-21T00:57:35.605731383Z" level=info msg="containerd successfully booted in 0.194854s" Jan 21 00:57:35.605885 systemd[1]: Started containerd.service - containerd container runtime. 
Jan 21 00:57:35.703703 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 21 00:57:35.741155 extend-filesystems[1681]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 21 00:57:35.741155 extend-filesystems[1681]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 21 00:57:35.741155 extend-filesystems[1681]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 21 00:57:35.745379 extend-filesystems[1642]: Resized filesystem in /dev/vda9 Jan 21 00:57:35.742003 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 21 00:57:35.742236 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 21 00:57:35.755824 systemd-networkd[1587]: eth0: Gained IPv6LL Jan 21 00:57:35.757784 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 21 00:57:35.760414 systemd[1]: Reached target network-online.target - Network is Online. Jan 21 00:57:35.764926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:57:35.767783 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 21 00:57:35.803468 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 21 00:57:35.815506 tar[1660]: linux-amd64/README.md Jan 21 00:57:35.834720 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 21 00:57:35.958480 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:36.514722 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:37.128288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 21 00:57:37.137146 (kubelet)[1779]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 21 00:57:37.927805 kubelet[1779]: E0121 00:57:37.927733 1779 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 21 00:57:37.931096 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 21 00:57:37.931295 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 21 00:57:37.931872 systemd[1]: kubelet.service: Consumed 992ms CPU time, 266M memory peak. Jan 21 00:57:37.968712 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:38.523728 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:39.748328 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 21 00:57:39.755018 systemd[1]: Started sshd@0-10.0.5.74:22-4.153.228.146:43516.service - OpenSSH per-connection server daemon (4.153.228.146:43516). Jan 21 00:57:40.319771 sshd[1788]: Accepted publickey for core from 4.153.228.146 port 43516 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:40.322298 sshd-session[1788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:40.330630 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 21 00:57:40.332130 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 21 00:57:40.335531 systemd-logind[1653]: New session 1 of user core. Jan 21 00:57:40.356617 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 21 00:57:40.359562 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Jan 21 00:57:40.372649 (systemd)[1794]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:40.375095 systemd-logind[1653]: New session 2 of user core. Jan 21 00:57:40.497052 systemd[1794]: Queued start job for default target default.target. Jan 21 00:57:40.513708 systemd[1794]: Created slice app.slice - User Application Slice. Jan 21 00:57:40.514183 systemd[1794]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 21 00:57:40.514253 systemd[1794]: Reached target paths.target - Paths. Jan 21 00:57:40.514302 systemd[1794]: Reached target timers.target - Timers. Jan 21 00:57:40.515546 systemd[1794]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 21 00:57:40.517490 systemd[1794]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 21 00:57:40.526865 systemd[1794]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 21 00:57:40.527033 systemd[1794]: Reached target sockets.target - Sockets. Jan 21 00:57:40.528939 systemd[1794]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 21 00:57:40.528999 systemd[1794]: Reached target basic.target - Basic System. Jan 21 00:57:40.529040 systemd[1794]: Reached target default.target - Main User Target. Jan 21 00:57:40.529085 systemd[1794]: Startup finished in 148ms. Jan 21 00:57:40.529286 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 21 00:57:40.532948 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 21 00:57:40.849190 systemd[1]: Started sshd@1-10.0.5.74:22-4.153.228.146:43522.service - OpenSSH per-connection server daemon (4.153.228.146:43522). 
Jan 21 00:57:41.387397 sshd[1812]: Accepted publickey for core from 4.153.228.146 port 43522 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:41.387906 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:41.391804 systemd-logind[1653]: New session 3 of user core. Jan 21 00:57:41.402943 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 21 00:57:41.691032 sshd[1816]: Connection closed by 4.153.228.146 port 43522 Jan 21 00:57:41.691598 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:41.696652 systemd-logind[1653]: Session 3 logged out. Waiting for processes to exit. Jan 21 00:57:41.697536 systemd[1]: sshd@1-10.0.5.74:22-4.153.228.146:43522.service: Deactivated successfully. Jan 21 00:57:41.699997 systemd[1]: session-3.scope: Deactivated successfully. Jan 21 00:57:41.702059 systemd-logind[1653]: Removed session 3. Jan 21 00:57:41.802670 systemd[1]: Started sshd@2-10.0.5.74:22-4.153.228.146:43524.service - OpenSSH per-connection server daemon (4.153.228.146:43524). 
Jan 21 00:57:41.978738 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:41.988711 coreos-metadata[1638]: Jan 21 00:57:41.987 WARN failed to locate config-drive, using the metadata service API instead Jan 21 00:57:42.003110 coreos-metadata[1638]: Jan 21 00:57:42.003 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 21 00:57:42.339119 coreos-metadata[1638]: Jan 21 00:57:42.338 INFO Fetch successful Jan 21 00:57:42.339280 coreos-metadata[1638]: Jan 21 00:57:42.339 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 21 00:57:42.343007 sshd[1822]: Accepted publickey for core from 4.153.228.146 port 43524 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:42.344823 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:42.349585 systemd-logind[1653]: New session 4 of user core. Jan 21 00:57:42.358353 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 21 00:57:42.480341 coreos-metadata[1638]: Jan 21 00:57:42.480 INFO Fetch successful Jan 21 00:57:42.480341 coreos-metadata[1638]: Jan 21 00:57:42.480 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 21 00:57:42.537718 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 21 00:57:42.549930 coreos-metadata[1740]: Jan 21 00:57:42.549 WARN failed to locate config-drive, using the metadata service API instead Jan 21 00:57:42.562558 coreos-metadata[1740]: Jan 21 00:57:42.562 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 21 00:57:42.646941 sshd[1828]: Connection closed by 4.153.228.146 port 43524 Jan 21 00:57:42.647549 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:42.652385 systemd[1]: sshd@2-10.0.5.74:22-4.153.228.146:43524.service: Deactivated successfully. Jan 21 00:57:42.654550 systemd[1]: session-4.scope: Deactivated successfully. 
Jan 21 00:57:42.655662 systemd-logind[1653]: Session 4 logged out. Waiting for processes to exit. Jan 21 00:57:42.657074 systemd-logind[1653]: Removed session 4. Jan 21 00:57:44.270542 coreos-metadata[1638]: Jan 21 00:57:44.270 INFO Fetch successful Jan 21 00:57:44.270542 coreos-metadata[1638]: Jan 21 00:57:44.270 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 21 00:57:44.273600 coreos-metadata[1740]: Jan 21 00:57:44.273 INFO Fetch successful Jan 21 00:57:44.273600 coreos-metadata[1740]: Jan 21 00:57:44.273 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 21 00:57:44.510380 coreos-metadata[1638]: Jan 21 00:57:44.510 INFO Fetch successful Jan 21 00:57:44.510380 coreos-metadata[1638]: Jan 21 00:57:44.510 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 21 00:57:44.594753 coreos-metadata[1740]: Jan 21 00:57:44.594 INFO Fetch successful Jan 21 00:57:44.598387 unknown[1740]: wrote ssh authorized keys file for user: core Jan 21 00:57:44.623909 update-ssh-keys[1836]: Updated "/home/core/.ssh/authorized_keys" Jan 21 00:57:44.625513 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 21 00:57:44.627209 systemd[1]: Finished sshkeys.service. Jan 21 00:57:44.633050 coreos-metadata[1638]: Jan 21 00:57:44.632 INFO Fetch successful Jan 21 00:57:44.633050 coreos-metadata[1638]: Jan 21 00:57:44.633 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 21 00:57:44.920271 coreos-metadata[1638]: Jan 21 00:57:44.920 INFO Fetch successful Jan 21 00:57:44.947464 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 21 00:57:44.948129 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 21 00:57:44.948255 systemd[1]: Reached target multi-user.target - Multi-User System. 
Jan 21 00:57:44.950767 systemd[1]: Startup finished in 3.651s (kernel) + 12.922s (initrd) + 13.391s (userspace) = 29.966s. Jan 21 00:57:48.181840 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 21 00:57:48.184072 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:57:48.338556 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 21 00:57:48.351112 (kubelet)[1852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 21 00:57:48.388135 kubelet[1852]: E0121 00:57:48.388080 1852 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 21 00:57:48.391748 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 21 00:57:48.391977 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 21 00:57:48.392584 systemd[1]: kubelet.service: Consumed 155ms CPU time, 110.2M memory peak. Jan 21 00:57:52.769811 systemd[1]: Started sshd@3-10.0.5.74:22-4.153.228.146:57304.service - OpenSSH per-connection server daemon (4.153.228.146:57304). Jan 21 00:57:53.308810 sshd[1860]: Accepted publickey for core from 4.153.228.146 port 57304 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:53.309915 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:53.313797 systemd-logind[1653]: New session 5 of user core. Jan 21 00:57:53.323887 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jan 21 00:57:53.611952 sshd[1864]: Connection closed by 4.153.228.146 port 57304 Jan 21 00:57:53.612385 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:53.616167 systemd-logind[1653]: Session 5 logged out. Waiting for processes to exit. Jan 21 00:57:53.616885 systemd[1]: sshd@3-10.0.5.74:22-4.153.228.146:57304.service: Deactivated successfully. Jan 21 00:57:53.618917 systemd[1]: session-5.scope: Deactivated successfully. Jan 21 00:57:53.620525 systemd-logind[1653]: Removed session 5. Jan 21 00:57:53.725874 systemd[1]: Started sshd@4-10.0.5.74:22-4.153.228.146:57320.service - OpenSSH per-connection server daemon (4.153.228.146:57320). Jan 21 00:57:54.268717 sshd[1870]: Accepted publickey for core from 4.153.228.146 port 57320 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:54.269843 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:54.273749 systemd-logind[1653]: New session 6 of user core. Jan 21 00:57:54.280989 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 21 00:57:54.568333 sshd[1874]: Connection closed by 4.153.228.146 port 57320 Jan 21 00:57:54.568200 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:54.573493 systemd[1]: sshd@4-10.0.5.74:22-4.153.228.146:57320.service: Deactivated successfully. Jan 21 00:57:54.575224 systemd[1]: session-6.scope: Deactivated successfully. Jan 21 00:57:54.576111 systemd-logind[1653]: Session 6 logged out. Waiting for processes to exit. Jan 21 00:57:54.577426 systemd-logind[1653]: Removed session 6. Jan 21 00:57:54.677621 systemd[1]: Started sshd@5-10.0.5.74:22-4.153.228.146:47386.service - OpenSSH per-connection server daemon (4.153.228.146:47386). 
Jan 21 00:57:55.227416 sshd[1880]: Accepted publickey for core from 4.153.228.146 port 47386 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:55.228904 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:55.234109 systemd-logind[1653]: New session 7 of user core. Jan 21 00:57:55.239953 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 21 00:57:55.531857 sshd[1884]: Connection closed by 4.153.228.146 port 47386 Jan 21 00:57:55.532455 sshd-session[1880]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:55.536293 systemd[1]: sshd@5-10.0.5.74:22-4.153.228.146:47386.service: Deactivated successfully. Jan 21 00:57:55.538256 systemd[1]: session-7.scope: Deactivated successfully. Jan 21 00:57:55.541131 systemd-logind[1653]: Session 7 logged out. Waiting for processes to exit. Jan 21 00:57:55.541887 systemd-logind[1653]: Removed session 7. Jan 21 00:57:55.641513 systemd[1]: Started sshd@6-10.0.5.74:22-4.153.228.146:47396.service - OpenSSH per-connection server daemon (4.153.228.146:47396). Jan 21 00:57:56.190893 sshd[1890]: Accepted publickey for core from 4.153.228.146 port 47396 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:56.192110 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:56.196570 systemd-logind[1653]: New session 8 of user core. Jan 21 00:57:56.203937 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 21 00:57:56.411970 sudo[1895]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 21 00:57:56.412243 sudo[1895]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 21 00:57:56.428819 sudo[1895]: pam_unix(sudo:session): session closed for user root Jan 21 00:57:56.528450 sshd[1894]: Connection closed by 4.153.228.146 port 47396 Jan 21 00:57:56.529050 sshd-session[1890]: pam_unix(sshd:session): session closed for user core Jan 21 00:57:56.532523 systemd-logind[1653]: Session 8 logged out. Waiting for processes to exit. Jan 21 00:57:56.532666 systemd[1]: sshd@6-10.0.5.74:22-4.153.228.146:47396.service: Deactivated successfully. Jan 21 00:57:56.534153 systemd[1]: session-8.scope: Deactivated successfully. Jan 21 00:57:56.536262 systemd-logind[1653]: Removed session 8. Jan 21 00:57:56.637731 systemd[1]: Started sshd@7-10.0.5.74:22-4.153.228.146:47410.service - OpenSSH per-connection server daemon (4.153.228.146:47410). Jan 21 00:57:57.174721 sshd[1902]: Accepted publickey for core from 4.153.228.146 port 47410 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 00:57:57.176159 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 00:57:57.179932 systemd-logind[1653]: New session 9 of user core. Jan 21 00:57:57.193942 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 21 00:57:57.380059 sudo[1908]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 21 00:57:57.380320 sudo[1908]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 21 00:57:57.385726 sudo[1908]: pam_unix(sudo:session): session closed for user root Jan 21 00:57:57.391928 sudo[1907]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 21 00:57:57.392184 sudo[1907]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 21 00:57:57.400470 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 21 00:57:57.438729 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 21 00:57:57.438806 kernel: audit: type=1305 audit(1768957077.436:232): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 21 00:57:57.436000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 21 00:57:57.439705 augenrules[1932]: No rules Jan 21 00:57:57.436000 audit[1932]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffde9950d30 a2=420 a3=0 items=0 ppid=1913 pid=1932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:57:57.442734 kernel: audit: type=1300 audit(1768957077.436:232): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffde9950d30 a2=420 a3=0 items=0 ppid=1913 pid=1932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:57:57.442922 systemd[1]: audit-rules.service: Deactivated successfully. 
Jan 21 00:57:57.443135 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 21 00:57:57.445893 sudo[1907]: pam_unix(sudo:session): session closed for user root
Jan 21 00:57:57.436000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 21 00:57:57.442000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.453332 kernel: audit: type=1327 audit(1768957077.436:232): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 21 00:57:57.453402 kernel: audit: type=1130 audit(1768957077.442:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.453432 kernel: audit: type=1131 audit(1768957077.442:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.445000 audit[1907]: USER_END pid=1907 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.445000 audit[1907]: CRED_DISP pid=1907 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.461058 kernel: audit: type=1106 audit(1768957077.445:235): pid=1907 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.461110 kernel: audit: type=1104 audit(1768957077.445:236): pid=1907 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.546051 sshd[1906]: Connection closed by 4.153.228.146 port 47410
Jan 21 00:57:57.545916 sshd-session[1902]: pam_unix(sshd:session): session closed for user core
Jan 21 00:57:57.546000 audit[1902]: USER_END pid=1902 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:57.546000 audit[1902]: CRED_DISP pid=1902 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:57.551189 systemd[1]: sshd@7-10.0.5.74:22-4.153.228.146:47410.service: Deactivated successfully.
Jan 21 00:57:57.553612 kernel: audit: type=1106 audit(1768957077.546:237): pid=1902 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:57.553653 kernel: audit: type=1104 audit(1768957077.546:238): pid=1902 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:57.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.5.74:22-4.153.228.146:47410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.553897 systemd[1]: session-9.scope: Deactivated successfully.
Jan 21 00:57:57.556515 systemd-logind[1653]: Session 9 logged out. Waiting for processes to exit.
Jan 21 00:57:57.557002 kernel: audit: type=1131 audit(1768957077.550:239): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.5.74:22-4.153.228.146:47410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:57.557582 systemd-logind[1653]: Removed session 9.
Jan 21 00:57:57.656302 systemd[1]: Started sshd@8-10.0.5.74:22-4.153.228.146:47426.service - OpenSSH per-connection server daemon (4.153.228.146:47426).
Jan 21 00:57:57.656000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.5.74:22-4.153.228.146:47426 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:58.192000 audit[1941]: USER_ACCT pid=1941 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:58.193335 sshd[1941]: Accepted publickey for core from 4.153.228.146 port 47426 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go
Jan 21 00:57:58.194000 audit[1941]: CRED_ACQ pid=1941 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:58.194000 audit[1941]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe66b49b80 a2=3 a3=0 items=0 ppid=1 pid=1941 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:57:58.194000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 00:57:58.195283 sshd-session[1941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 21 00:57:58.199989 systemd-logind[1653]: New session 10 of user core.
Jan 21 00:57:58.206906 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 21 00:57:58.209000 audit[1941]: USER_START pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:58.211000 audit[1945]: CRED_ACQ pid=1945 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 00:57:58.398000 audit[1946]: USER_ACCT pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:58.399501 sudo[1946]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 21 00:57:58.398000 audit[1946]: CRED_REFR pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:58.399788 sudo[1946]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 21 00:57:58.399000 audit[1946]: USER_START pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:58.642363 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Jan 21 00:57:58.645950 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 21 00:57:58.734082 chronyd[1636]: Selected source PHC0
Jan 21 00:57:59.536358 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 21 00:57:59.550390 (dockerd)[1969]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 21 00:57:59.694995 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 21 00:57:59.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 00:57:59.707272 (kubelet)[1974]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 21 00:57:59.825364 kubelet[1974]: E0121 00:57:59.825137 1974 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 21 00:57:59.828068 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 00:57:59.828199 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 21 00:57:59.827000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Jan 21 00:57:59.828771 systemd[1]: kubelet.service: Consumed 147ms CPU time, 108.7M memory peak.
Jan 21 00:58:00.088523 dockerd[1969]: time="2026-01-21T00:58:00.087912659Z" level=info msg="Starting up"
Jan 21 00:58:00.091080 dockerd[1969]: time="2026-01-21T00:58:00.091052139Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Jan 21 00:58:00.104089 dockerd[1969]: time="2026-01-21T00:58:00.104017761Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Jan 21 00:58:00.123071 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1682705603-merged.mount: Deactivated successfully.
Jan 21 00:58:00.152709 dockerd[1969]: time="2026-01-21T00:58:00.152514630Z" level=info msg="Loading containers: start."
Jan 21 00:58:00.164711 kernel: Initializing XFRM netlink socket
Jan 21 00:58:00.244000 audit[2029]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.244000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcfd562cb0 a2=0 a3=0 items=0 ppid=1969 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.244000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 21 00:58:00.246000 audit[2031]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.246000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffdd8ddcac0 a2=0 a3=0 items=0 ppid=1969 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.246000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 21 00:58:00.249000 audit[2033]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.249000 audit[2033]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeb338bce0 a2=0 a3=0 items=0 ppid=1969 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.249000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 21 00:58:00.250000 audit[2035]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.250000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde378ff90 a2=0 a3=0 items=0 ppid=1969 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.250000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 21 00:58:00.253000 audit[2037]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.253000 audit[2037]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe335db880 a2=0 a3=0 items=0 ppid=1969 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.253000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 21 00:58:00.258000 audit[2039]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.258000 audit[2039]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd7460ce80 a2=0 a3=0 items=0 ppid=1969 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.258000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 21 00:58:00.260000 audit[2041]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.260000 audit[2041]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd86760460 a2=0 a3=0 items=0 ppid=1969 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.260000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 21 00:58:00.262000 audit[2043]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.262000 audit[2043]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7fff75f765f0 a2=0 a3=0 items=0 ppid=1969 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.262000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 21 00:58:00.297000 audit[2046]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.297000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffe76965db0 a2=0 a3=0 items=0 ppid=1969 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.297000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38
Jan 21 00:58:00.299000 audit[2048]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.299000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe80783af0 a2=0 a3=0 items=0 ppid=1969 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.299000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 21 00:58:00.301000 audit[2050]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.301000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff8f5de290 a2=0 a3=0 items=0 ppid=1969 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.301000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 21 00:58:00.302000 audit[2052]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.302000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fff9f7689a0 a2=0 a3=0 items=0 ppid=1969 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.302000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 21 00:58:00.304000 audit[2054]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.304000 audit[2054]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffffbed5700 a2=0 a3=0 items=0 ppid=1969 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.304000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 21 00:58:00.348000 audit[2084]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.348000 audit[2084]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffe38df4700 a2=0 a3=0 items=0 ppid=1969 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.348000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 21 00:58:00.351000 audit[2086]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.351000 audit[2086]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe87481760 a2=0 a3=0 items=0 ppid=1969 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.351000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 21 00:58:00.353000 audit[2088]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.353000 audit[2088]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff3583c220 a2=0 a3=0 items=0 ppid=1969 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.353000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 21 00:58:00.354000 audit[2090]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.354000 audit[2090]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffff597dad0 a2=0 a3=0 items=0 ppid=1969 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.354000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 21 00:58:00.356000 audit[2092]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.356000 audit[2092]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe96ddd0d0 a2=0 a3=0 items=0 ppid=1969 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.356000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 21 00:58:00.359000 audit[2094]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.359000 audit[2094]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffcf3d685e0 a2=0 a3=0 items=0 ppid=1969 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.359000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 21 00:58:00.361000 audit[2096]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.361000 audit[2096]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc4d2ea4c0 a2=0 a3=0 items=0 ppid=1969 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.361000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 21 00:58:00.363000 audit[2098]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.363000 audit[2098]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffe0c4b6f00 a2=0 a3=0 items=0 ppid=1969 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.363000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 21 00:58:00.366000 audit[2100]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.366000 audit[2100]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7fff916bf770 a2=0 a3=0 items=0 ppid=1969 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.366000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238
Jan 21 00:58:00.367000 audit[2102]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.367000 audit[2102]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff4591fc70 a2=0 a3=0 items=0 ppid=1969 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.367000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 21 00:58:00.369000 audit[2104]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.369000 audit[2104]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffe68ff68f0 a2=0 a3=0 items=0 ppid=1969 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.369000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 21 00:58:00.372000 audit[2106]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.372000 audit[2106]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffead493e20 a2=0 a3=0 items=0 ppid=1969 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.372000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 21 00:58:00.374000 audit[2108]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.374000 audit[2108]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fffdf1d2790 a2=0 a3=0 items=0 ppid=1969 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.374000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 21 00:58:00.378000 audit[2113]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.378000 audit[2113]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe657b48e0 a2=0 a3=0 items=0 ppid=1969 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.378000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 21 00:58:00.380000 audit[2115]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.380000 audit[2115]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fff060cbc00 a2=0 a3=0 items=0 ppid=1969 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.380000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 21 00:58:00.382000 audit[2117]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2117 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.382000 audit[2117]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe94e79d80 a2=0 a3=0 items=0 ppid=1969 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.382000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 21 00:58:00.385000 audit[2119]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.385000 audit[2119]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffed73e53c0 a2=0 a3=0 items=0 ppid=1969 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.385000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 21 00:58:00.387000 audit[2121]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.387000 audit[2121]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc131371f0 a2=0 a3=0 items=0 ppid=1969 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.387000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 21 00:58:00.389000 audit[2123]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 21 00:58:00.389000 audit[2123]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd010b1660 a2=0 a3=0 items=0 ppid=1969 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.389000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 21 00:58:00.429000 audit[2128]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2128 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.429000 audit[2128]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffe7a983b10 a2=0 a3=0 items=0 ppid=1969 pid=2128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.429000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445
Jan 21 00:58:00.431000 audit[2130]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2130 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.431000 audit[2130]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7fff455c53c0 a2=0 a3=0 items=0 ppid=1969 pid=2130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.431000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E
Jan 21 00:58:00.440000 audit[2138]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2138 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.440000 audit[2138]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffcdeff99f0 a2=0 a3=0 items=0 ppid=1969 pid=2138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.440000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054
Jan 21 00:58:00.451000 audit[2144]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2144 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.451000 audit[2144]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff92b14c70 a2=0 a3=0 items=0 ppid=1969 pid=2144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.451000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50
Jan 21 00:58:00.453000 audit[2146]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2146 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 21 00:58:00.453000 audit[2146]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffc8c22e450 a2=0 a3=0 items=0 ppid=1969 pid=2146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 00:58:00.453000 audit: PROCTITLE
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 21 00:58:00.455000 audit[2148]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:00.455000 audit[2148]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffdc772b350 a2=0 a3=0 items=0 ppid=1969 pid=2148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:00.455000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 21 00:58:00.457000 audit[2150]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:00.457000 audit[2150]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd89cab7d0 a2=0 a3=0 items=0 ppid=1969 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:00.457000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 21 00:58:00.458000 audit[2152]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:00.458000 audit[2152]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe1504e320 
a2=0 a3=0 items=0 ppid=1969 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:00.458000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 21 00:58:00.461274 systemd-networkd[1587]: docker0: Link UP Jan 21 00:58:00.469593 dockerd[1969]: time="2026-01-21T00:58:00.469517062Z" level=info msg="Loading containers: done." Jan 21 00:58:00.496157 dockerd[1969]: time="2026-01-21T00:58:00.496094992Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 21 00:58:00.496326 dockerd[1969]: time="2026-01-21T00:58:00.496187632Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 21 00:58:00.496326 dockerd[1969]: time="2026-01-21T00:58:00.496275021Z" level=info msg="Initializing buildkit" Jan 21 00:58:00.531668 dockerd[1969]: time="2026-01-21T00:58:00.531628118Z" level=info msg="Completed buildkit initialization" Jan 21 00:58:00.537586 dockerd[1969]: time="2026-01-21T00:58:00.537541543Z" level=info msg="Daemon has completed initialization" Jan 21 00:58:00.538328 dockerd[1969]: time="2026-01-21T00:58:00.537744807Z" level=info msg="API listen on /run/docker.sock" Jan 21 00:58:00.538273 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 21 00:58:00.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:58:01.904901 containerd[1680]: time="2026-01-21T00:58:01.904821603Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 21 00:58:02.603002 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2552598171.mount: Deactivated successfully. Jan 21 00:58:03.473279 containerd[1680]: time="2026-01-21T00:58:03.473199992Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:03.475429 containerd[1680]: time="2026-01-21T00:58:03.475237548Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 21 00:58:03.477101 containerd[1680]: time="2026-01-21T00:58:03.477073197Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:03.479690 containerd[1680]: time="2026-01-21T00:58:03.479656705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:03.480386 containerd[1680]: time="2026-01-21T00:58:03.480363232Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.575506683s" Jan 21 00:58:03.480450 containerd[1680]: time="2026-01-21T00:58:03.480440364Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 21 00:58:03.480910 containerd[1680]: 
time="2026-01-21T00:58:03.480894832Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 21 00:58:04.923428 containerd[1680]: time="2026-01-21T00:58:04.923359603Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:04.925509 containerd[1680]: time="2026-01-21T00:58:04.925467397Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 21 00:58:04.927483 containerd[1680]: time="2026-01-21T00:58:04.927322937Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:04.931038 containerd[1680]: time="2026-01-21T00:58:04.930983328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:04.932438 containerd[1680]: time="2026-01-21T00:58:04.932067062Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.451092492s" Jan 21 00:58:04.932438 containerd[1680]: time="2026-01-21T00:58:04.932420283Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 21 00:58:04.932843 containerd[1680]: time="2026-01-21T00:58:04.932801668Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 21 
00:58:06.158080 containerd[1680]: time="2026-01-21T00:58:06.158020982Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:06.160416 containerd[1680]: time="2026-01-21T00:58:06.160384735Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 21 00:58:06.161713 containerd[1680]: time="2026-01-21T00:58:06.161662780Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:06.169938 containerd[1680]: time="2026-01-21T00:58:06.169876127Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:06.170602 containerd[1680]: time="2026-01-21T00:58:06.170288709Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.237446591s" Jan 21 00:58:06.170602 containerd[1680]: time="2026-01-21T00:58:06.170317897Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 21 00:58:06.171185 containerd[1680]: time="2026-01-21T00:58:06.171171044Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 21 00:58:07.028452 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3827936765.mount: Deactivated successfully. 
Jan 21 00:58:08.013867 containerd[1680]: time="2026-01-21T00:58:08.013377486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:08.015130 containerd[1680]: time="2026-01-21T00:58:08.015109846Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=0" Jan 21 00:58:08.016662 containerd[1680]: time="2026-01-21T00:58:08.016636302Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:08.018987 containerd[1680]: time="2026-01-21T00:58:08.018967046Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:08.019351 containerd[1680]: time="2026-01-21T00:58:08.019334345Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.848074935s" Jan 21 00:58:08.019420 containerd[1680]: time="2026-01-21T00:58:08.019410859Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 21 00:58:08.019903 containerd[1680]: time="2026-01-21T00:58:08.019882841Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 21 00:58:08.765678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount615871462.mount: Deactivated successfully. 
Jan 21 00:58:09.427957 containerd[1680]: time="2026-01-21T00:58:09.427907274Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:09.429502 containerd[1680]: time="2026-01-21T00:58:09.429306097Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17692010" Jan 21 00:58:09.431194 containerd[1680]: time="2026-01-21T00:58:09.431168665Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:09.434046 containerd[1680]: time="2026-01-21T00:58:09.434013882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:09.434765 containerd[1680]: time="2026-01-21T00:58:09.434745347Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.414717986s" Jan 21 00:58:09.434835 containerd[1680]: time="2026-01-21T00:58:09.434822342Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 21 00:58:09.435305 containerd[1680]: time="2026-01-21T00:58:09.435279697Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 21 00:58:09.961035 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 21 00:58:09.963167 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 21 00:58:10.022139 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1381977277.mount: Deactivated successfully. Jan 21 00:58:10.042723 containerd[1680]: time="2026-01-21T00:58:10.041868387Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 21 00:58:10.046021 containerd[1680]: time="2026-01-21T00:58:10.045976063Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 21 00:58:10.050350 containerd[1680]: time="2026-01-21T00:58:10.050293999Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 21 00:58:10.114567 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 21 00:58:10.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:10.116047 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 21 00:58:10.116107 kernel: audit: type=1130 audit(1768957090.114:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:58:10.130317 (kubelet)[2328]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 21 00:58:10.321203 kubelet[2328]: E0121 00:58:10.321057 2328 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 21 00:58:10.323567 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 21 00:58:10.323708 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 21 00:58:10.323000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 21 00:58:10.324075 systemd[1]: kubelet.service: Consumed 160ms CPU time, 110.2M memory peak. Jan 21 00:58:10.327755 kernel: audit: type=1131 audit(1768957090.323:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 21 00:58:10.554489 containerd[1680]: time="2026-01-21T00:58:10.554422046Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 21 00:58:10.555475 containerd[1680]: time="2026-01-21T00:58:10.555095877Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.119791481s" Jan 21 00:58:10.555475 containerd[1680]: time="2026-01-21T00:58:10.555125919Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 21 00:58:10.556065 containerd[1680]: time="2026-01-21T00:58:10.556045829Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 21 00:58:11.347467 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2112707622.mount: Deactivated successfully. 
Jan 21 00:58:14.021472 containerd[1680]: time="2026-01-21T00:58:14.021418305Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:14.022960 containerd[1680]: time="2026-01-21T00:58:14.022727741Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 21 00:58:14.024557 containerd[1680]: time="2026-01-21T00:58:14.024526089Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:14.029556 containerd[1680]: time="2026-01-21T00:58:14.029505551Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:14.033535 containerd[1680]: time="2026-01-21T00:58:14.033071191Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.476927347s" Jan 21 00:58:14.033535 containerd[1680]: time="2026-01-21T00:58:14.033107237Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 21 00:58:16.334710 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 21 00:58:16.334858 systemd[1]: kubelet.service: Consumed 160ms CPU time, 110.2M memory peak. Jan 21 00:58:16.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:58:16.343081 kernel: audit: type=1130 audit(1768957096.334:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:16.343159 kernel: audit: type=1131 audit(1768957096.334:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:16.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:16.341094 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:58:16.369919 systemd[1]: Reload requested from client PID 2416 ('systemctl') (unit session-10.scope)... Jan 21 00:58:16.369934 systemd[1]: Reloading... Jan 21 00:58:16.485704 zram_generator::config[2465]: No configuration found. Jan 21 00:58:16.668243 systemd[1]: Reloading finished in 297 ms. 
Jan 21 00:58:16.701242 kernel: audit: type=1334 audit(1768957096.698:296): prog-id=63 op=LOAD Jan 21 00:58:16.701332 kernel: audit: type=1334 audit(1768957096.698:297): prog-id=64 op=LOAD Jan 21 00:58:16.698000 audit: BPF prog-id=63 op=LOAD Jan 21 00:58:16.698000 audit: BPF prog-id=64 op=LOAD Jan 21 00:58:16.698000 audit: BPF prog-id=47 op=UNLOAD Jan 21 00:58:16.704690 kernel: audit: type=1334 audit(1768957096.698:298): prog-id=47 op=UNLOAD Jan 21 00:58:16.698000 audit: BPF prog-id=48 op=UNLOAD Jan 21 00:58:16.698000 audit: BPF prog-id=65 op=LOAD Jan 21 00:58:16.706985 kernel: audit: type=1334 audit(1768957096.698:299): prog-id=48 op=UNLOAD Jan 21 00:58:16.707019 kernel: audit: type=1334 audit(1768957096.698:300): prog-id=65 op=LOAD Jan 21 00:58:16.698000 audit: BPF prog-id=59 op=UNLOAD Jan 21 00:58:16.708189 kernel: audit: type=1334 audit(1768957096.698:301): prog-id=59 op=UNLOAD Jan 21 00:58:16.699000 audit: BPF prog-id=66 op=LOAD Jan 21 00:58:16.709980 kernel: audit: type=1334 audit(1768957096.699:302): prog-id=66 op=LOAD Jan 21 00:58:16.710007 kernel: audit: type=1334 audit(1768957096.699:303): prog-id=43 op=UNLOAD Jan 21 00:58:16.699000 audit: BPF prog-id=43 op=UNLOAD Jan 21 00:58:16.699000 audit: BPF prog-id=67 op=LOAD Jan 21 00:58:16.699000 audit: BPF prog-id=52 op=UNLOAD Jan 21 00:58:16.699000 audit: BPF prog-id=68 op=LOAD Jan 21 00:58:16.699000 audit: BPF prog-id=69 op=LOAD Jan 21 00:58:16.699000 audit: BPF prog-id=53 op=UNLOAD Jan 21 00:58:16.699000 audit: BPF prog-id=54 op=UNLOAD Jan 21 00:58:16.701000 audit: BPF prog-id=70 op=LOAD Jan 21 00:58:16.701000 audit: BPF prog-id=49 op=UNLOAD Jan 21 00:58:16.701000 audit: BPF prog-id=71 op=LOAD Jan 21 00:58:16.701000 audit: BPF prog-id=72 op=LOAD Jan 21 00:58:16.701000 audit: BPF prog-id=50 op=UNLOAD Jan 21 00:58:16.701000 audit: BPF prog-id=51 op=UNLOAD Jan 21 00:58:16.703000 audit: BPF prog-id=73 op=LOAD Jan 21 00:58:16.703000 audit: BPF prog-id=44 op=UNLOAD Jan 21 00:58:16.703000 audit: BPF prog-id=74 
op=LOAD Jan 21 00:58:16.703000 audit: BPF prog-id=75 op=LOAD Jan 21 00:58:16.703000 audit: BPF prog-id=45 op=UNLOAD Jan 21 00:58:16.703000 audit: BPF prog-id=46 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=76 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=60 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=77 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=78 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=61 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=62 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=79 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=55 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=80 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=81 op=LOAD Jan 21 00:58:16.704000 audit: BPF prog-id=56 op=UNLOAD Jan 21 00:58:16.704000 audit: BPF prog-id=57 op=UNLOAD Jan 21 00:58:16.712000 audit: BPF prog-id=82 op=LOAD Jan 21 00:58:16.712000 audit: BPF prog-id=58 op=UNLOAD Jan 21 00:58:16.727208 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 21 00:58:16.727284 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 21 00:58:16.727561 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 21 00:58:16.727615 systemd[1]: kubelet.service: Consumed 98ms CPU time, 98.5M memory peak. Jan 21 00:58:16.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 21 00:58:16.729155 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:58:17.461389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 21 00:58:17.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 00:58:17.472990 (kubelet)[2516]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 21 00:58:17.860885 kubelet[2516]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 00:58:17.860885 kubelet[2516]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 21 00:58:17.860885 kubelet[2516]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 00:58:17.861698 kubelet[2516]: I0121 00:58:17.861262 2516 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 21 00:58:18.078988 kubelet[2516]: I0121 00:58:18.078258 2516 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 21 00:58:18.078988 kubelet[2516]: I0121 00:58:18.078286 2516 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 00:58:18.078988 kubelet[2516]: I0121 00:58:18.078523 2516 server.go:954] "Client rotation is on, will bootstrap in background" Jan 21 00:58:18.143783 kubelet[2516]: E0121 00:58:18.142854 2516 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.5.74:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:18.144412 kubelet[2516]: I0121 
00:58:18.144390 2516 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 21 00:58:18.172044 kubelet[2516]: I0121 00:58:18.172017 2516 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 21 00:58:18.175121 kubelet[2516]: I0121 00:58:18.175067 2516 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 21 00:58:18.251735 kubelet[2516]: I0121 00:58:18.251641 2516 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 00:58:18.251932 kubelet[2516]: I0121 00:58:18.251722 2516 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-af1f1f5a24","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 00:58:18.252036 kubelet[2516]: I0121 00:58:18.251937 2516 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 00:58:18.252036 kubelet[2516]: I0121 00:58:18.251947 2516 container_manager_linux.go:304] "Creating device plugin manager" Jan 21 00:58:18.252082 kubelet[2516]: I0121 00:58:18.252071 2516 state_mem.go:36] "Initialized new in-memory state store" Jan 21 00:58:18.440291 kubelet[2516]: I0121 00:58:18.440172 2516 kubelet.go:446] "Attempting to sync node with API server" Jan 21 00:58:18.443548 kubelet[2516]: I0121 00:58:18.443513 2516 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 00:58:18.443630 kubelet[2516]: I0121 00:58:18.443571 2516 kubelet.go:352] "Adding apiserver pod source" Jan 21 00:58:18.443630 kubelet[2516]: I0121 00:58:18.443584 2516 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 00:58:18.448600 kubelet[2516]: W0121 00:58:18.448552 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.5.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-af1f1f5a24&limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:18.448710 kubelet[2516]: E0121 00:58:18.448606 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.5.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-af1f1f5a24&limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:18.448710 kubelet[2516]: W0121 
00:58:18.449238 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.5.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:18.448710 kubelet[2516]: E0121 00:58:18.449274 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.5.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:18.448710 kubelet[2516]: I0121 00:58:18.449647 2516 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 21 00:58:18.450908 kubelet[2516]: I0121 00:58:18.450886 2516 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 00:58:18.462129 kubelet[2516]: W0121 00:58:18.462081 2516 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jan 21 00:58:18.471331 kubelet[2516]: I0121 00:58:18.471296 2516 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 21 00:58:18.471331 kubelet[2516]: I0121 00:58:18.471337 2516 server.go:1287] "Started kubelet" Jan 21 00:58:18.471709 kubelet[2516]: I0121 00:58:18.471482 2516 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 00:58:18.472346 kubelet[2516]: I0121 00:58:18.472334 2516 server.go:479] "Adding debug handlers to kubelet server" Jan 21 00:58:18.478304 kubelet[2516]: I0121 00:58:18.478281 2516 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 00:58:18.480723 kubelet[2516]: I0121 00:58:18.480553 2516 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 00:58:18.481022 kubelet[2516]: I0121 00:58:18.481010 2516 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 00:58:18.481000 audit[2528]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2528 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.481000 audit[2528]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffeebed5a40 a2=0 a3=0 items=0 ppid=2516 pid=2528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.481000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 21 00:58:18.483040 kubelet[2516]: E0121 00:58:18.481244 2516 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.5.74:6443/api/v1/namespaces/default/events\": dial tcp 10.0.5.74:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-n-af1f1f5a24.188c991bd549d1e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-n-af1f1f5a24,UID:ci-4547-0-0-n-af1f1f5a24,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-af1f1f5a24,},FirstTimestamp:2026-01-21 00:58:18.471313897 +0000 UTC m=+0.995280246,LastTimestamp:2026-01-21 00:58:18.471313897 +0000 UTC m=+0.995280246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-af1f1f5a24,}" Jan 21 00:58:18.484000 audit[2529]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2529 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.484000 audit[2529]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc89ece9d0 a2=0 a3=0 items=0 ppid=2516 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.484000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 21 00:58:18.486451 kubelet[2516]: I0121 00:58:18.486416 2516 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 21 00:58:18.488995 kubelet[2516]: I0121 00:58:18.488967 2516 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 21 00:58:18.489203 kubelet[2516]: E0121 00:58:18.489183 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:18.489000 audit[2531]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2531 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.489000 audit[2531]: SYSCALL arch=c000003e syscall=46 
success=yes exit=340 a0=3 a1=7ffdd486ddf0 a2=0 a3=0 items=0 ppid=2516 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.489000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 21 00:58:18.492187 kubelet[2516]: I0121 00:58:18.492136 2516 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 21 00:58:18.492187 kubelet[2516]: I0121 00:58:18.492188 2516 reconciler.go:26] "Reconciler: start to sync state" Jan 21 00:58:18.492000 audit[2533]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.492000 audit[2533]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe9d1f27a0 a2=0 a3=0 items=0 ppid=2516 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.492000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 21 00:58:18.505802 kubelet[2516]: E0121 00:58:18.505753 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-af1f1f5a24?timeout=10s\": dial tcp 10.0.5.74:6443: connect: connection refused" interval="200ms" Jan 21 00:58:18.506334 kubelet[2516]: W0121 00:58:18.506064 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.5.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: 
connection refused Jan 21 00:58:18.506334 kubelet[2516]: E0121 00:58:18.506305 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.5.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:18.507616 kubelet[2516]: I0121 00:58:18.507594 2516 factory.go:221] Registration of the systemd container factory successfully Jan 21 00:58:18.507774 kubelet[2516]: I0121 00:58:18.507754 2516 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 21 00:58:18.514202 kubelet[2516]: I0121 00:58:18.514168 2516 factory.go:221] Registration of the containerd container factory successfully Jan 21 00:58:18.515070 kubelet[2516]: E0121 00:58:18.515038 2516 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 21 00:58:18.531629 kubelet[2516]: I0121 00:58:18.531591 2516 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 21 00:58:18.531629 kubelet[2516]: I0121 00:58:18.531604 2516 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 21 00:58:18.531982 kubelet[2516]: I0121 00:58:18.531798 2516 state_mem.go:36] "Initialized new in-memory state store" Jan 21 00:58:18.590136 kubelet[2516]: E0121 00:58:18.590088 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:18.656000 audit[2539]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2539 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.656000 audit[2539]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffd3d555db0 a2=0 a3=0 items=0 ppid=2516 pid=2539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.656000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 21 00:58:18.657000 audit[2542]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2542 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.657000 audit[2542]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe42e39e10 a2=0 a3=0 items=0 ppid=2516 pid=2542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.657000 audit: 
PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 21 00:58:18.658000 audit[2541]: NETFILTER_CFG table=mangle:48 family=10 entries=2 op=nft_register_chain pid=2541 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:18.658000 audit[2541]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffededb9e10 a2=0 a3=0 items=0 ppid=2516 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.658000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 21 00:58:18.658000 audit[2543]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2543 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.658000 audit[2543]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda81082d0 a2=0 a3=0 items=0 ppid=2516 pid=2543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.658000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 21 00:58:18.660000 audit[2545]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2545 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:18.660000 audit[2545]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff1e69bba0 a2=0 a3=0 items=0 ppid=2516 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:18.660000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 21 00:58:18.661000 audit[2544]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_chain pid=2544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:18.661000 audit[2544]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffca8cb19e0 a2=0 a3=0 items=0 ppid=2516 pid=2544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.661000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 21 00:58:18.661000 audit[2546]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2546 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:18.661000 audit[2546]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffebf2a3bd0 a2=0 a3=0 items=0 ppid=2516 pid=2546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:18.661000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 21 00:58:18.662000 audit[2547]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2547 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:18.662000 audit[2547]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffed13ba300 a2=0 a3=0 items=0 ppid=2516 pid=2547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:58:18.662000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 21 00:58:18.757947 kubelet[2516]: I0121 00:58:18.657162 2516 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 00:58:18.757947 kubelet[2516]: I0121 00:58:18.659442 2516 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 21 00:58:18.757947 kubelet[2516]: I0121 00:58:18.659466 2516 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 21 00:58:18.757947 kubelet[2516]: I0121 00:58:18.659484 2516 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 21 00:58:18.757947 kubelet[2516]: I0121 00:58:18.659491 2516 kubelet.go:2382] "Starting kubelet main sync loop" Jan 21 00:58:18.757947 kubelet[2516]: E0121 00:58:18.659539 2516 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 00:58:18.757947 kubelet[2516]: W0121 00:58:18.661592 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.5.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:18.757947 kubelet[2516]: E0121 00:58:18.661751 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.5.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:18.757947 kubelet[2516]: E0121 00:58:18.690769 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" 
not found" Jan 21 00:58:18.757947 kubelet[2516]: E0121 00:58:18.707632 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-af1f1f5a24?timeout=10s\": dial tcp 10.0.5.74:6443: connect: connection refused" interval="400ms" Jan 21 00:58:18.760869 kubelet[2516]: E0121 00:58:18.759775 2516 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 21 00:58:18.760869 kubelet[2516]: I0121 00:58:18.759913 2516 policy_none.go:49] "None policy: Start" Jan 21 00:58:18.760869 kubelet[2516]: I0121 00:58:18.759929 2516 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 21 00:58:18.760869 kubelet[2516]: I0121 00:58:18.759941 2516 state_mem.go:35] "Initializing new in-memory state store" Jan 21 00:58:18.791585 kubelet[2516]: E0121 00:58:18.791500 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:18.892177 kubelet[2516]: E0121 00:58:18.892098 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:18.960789 kubelet[2516]: E0121 00:58:18.960725 2516 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 21 00:58:18.992467 kubelet[2516]: E0121 00:58:18.992409 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.093350 kubelet[2516]: E0121 00:58:19.093217 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.108989 kubelet[2516]: E0121 00:58:19.108952 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.0.5.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-af1f1f5a24?timeout=10s\": dial tcp 10.0.5.74:6443: connect: connection refused" interval="800ms" Jan 21 00:58:19.193639 kubelet[2516]: E0121 00:58:19.193575 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.294363 kubelet[2516]: E0121 00:58:19.294295 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.361831 kubelet[2516]: E0121 00:58:19.361704 2516 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 21 00:58:19.395348 kubelet[2516]: E0121 00:58:19.395279 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.496112 kubelet[2516]: E0121 00:58:19.495915 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.548763 kubelet[2516]: W0121 00:58:19.548724 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.5.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:19.548763 kubelet[2516]: E0121 00:58:19.548770 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.5.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:19.596629 kubelet[2516]: E0121 00:58:19.596580 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" 
Jan 21 00:58:19.900949 kubelet[2516]: E0121 00:58:19.697535 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.900949 kubelet[2516]: E0121 00:58:19.798441 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.900949 kubelet[2516]: W0121 00:58:19.840257 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.5.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-af1f1f5a24&limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:19.900949 kubelet[2516]: E0121 00:58:19.840380 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.5.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-af1f1f5a24&limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:19.900949 kubelet[2516]: E0121 00:58:19.899131 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.905940 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 21 00:58:19.909896 kubelet[2516]: E0121 00:58:19.909874 2516 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-af1f1f5a24?timeout=10s\": dial tcp 10.0.5.74:6443: connect: connection refused" interval="1.6s" Jan 21 00:58:19.915311 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 21 00:58:19.925861 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Jan 21 00:58:19.927699 kubelet[2516]: I0121 00:58:19.927031 2516 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 00:58:19.927699 kubelet[2516]: I0121 00:58:19.927179 2516 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 00:58:19.927699 kubelet[2516]: I0121 00:58:19.927188 2516 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 00:58:19.927699 kubelet[2516]: I0121 00:58:19.927661 2516 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 00:58:19.928660 kubelet[2516]: E0121 00:58:19.928648 2516 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 21 00:58:19.928832 kubelet[2516]: E0121 00:58:19.928823 2516 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:19.984919 update_engine[1654]: I20260121 00:58:19.984855 1654 update_attempter.cc:509] Updating boot flags... 
Jan 21 00:58:20.028627 kubelet[2516]: W0121 00:58:20.028543 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.5.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:20.028936 kubelet[2516]: E0121 00:58:20.028907 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.5.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:20.029881 kubelet[2516]: I0121 00:58:20.029852 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.032699 kubelet[2516]: E0121 00:58:20.032588 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.74:6443/api/v1/nodes\": dial tcp 10.0.5.74:6443: connect: connection refused" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.133554 kubelet[2516]: W0121 00:58:20.133485 2516 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.5.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.5.74:6443: connect: connection refused Jan 21 00:58:20.133554 kubelet[2516]: E0121 00:58:20.133549 2516 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.5.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:20.179857 kubelet[2516]: E0121 00:58:20.179560 2516 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: 
Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.5.74:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.5.74:6443: connect: connection refused" logger="UnhandledError" Jan 21 00:58:20.186415 systemd[1]: Created slice kubepods-burstable-pode977b4b2c6a0e29608872340f5eef491.slice - libcontainer container kubepods-burstable-pode977b4b2c6a0e29608872340f5eef491.slice. Jan 21 00:58:20.209522 kubelet[2516]: I0121 00:58:20.207999 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209622 kubelet[2516]: I0121 00:58:20.209545 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209622 kubelet[2516]: I0121 00:58:20.209569 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/691b10bb0b58b5aa64e93e5d2dcfd029-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-af1f1f5a24\" (UID: \"691b10bb0b58b5aa64e93e5d2dcfd029\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209622 kubelet[2516]: I0121 00:58:20.209584 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209622 kubelet[2516]: I0121 00:58:20.209598 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209622 kubelet[2516]: I0121 00:58:20.209613 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209733 kubelet[2516]: I0121 00:58:20.209639 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209733 kubelet[2516]: I0121 00:58:20.209651 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.209733 kubelet[2516]: I0121 
00:58:20.209665 2516 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.210818 kubelet[2516]: E0121 00:58:20.210798 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.231324 systemd[1]: Created slice kubepods-burstable-pod3d02a747eede2de7b982876e4c5c8d07.slice - libcontainer container kubepods-burstable-pod3d02a747eede2de7b982876e4c5c8d07.slice. Jan 21 00:58:20.238672 kubelet[2516]: I0121 00:58:20.238646 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.241132 kubelet[2516]: E0121 00:58:20.241107 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.74:6443/api/v1/nodes\": dial tcp 10.0.5.74:6443: connect: connection refused" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.250757 kubelet[2516]: E0121 00:58:20.250731 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.280692 systemd[1]: Created slice kubepods-burstable-pod691b10bb0b58b5aa64e93e5d2dcfd029.slice - libcontainer container kubepods-burstable-pod691b10bb0b58b5aa64e93e5d2dcfd029.slice. 
Jan 21 00:58:20.300779 kubelet[2516]: E0121 00:58:20.300705 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.512288 containerd[1680]: time="2026-01-21T00:58:20.512060503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-af1f1f5a24,Uid:e977b4b2c6a0e29608872340f5eef491,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:20.545879 containerd[1680]: time="2026-01-21T00:58:20.545832397Z" level=info msg="connecting to shim fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45" address="unix:///run/containerd/s/6ae0c80bd0912be6d5b9b8292c444e2993f8603b07bb667ebfd9f6eced0c9145" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:20.547144 kubelet[2516]: E0121 00:58:20.547062 2516 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.5.74:6443/api/v1/namespaces/default/events\": dial tcp 10.0.5.74:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-n-af1f1f5a24.188c991bd549d1e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-n-af1f1f5a24,UID:ci-4547-0-0-n-af1f1f5a24,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-af1f1f5a24,},FirstTimestamp:2026-01-21 00:58:18.471313897 +0000 UTC m=+0.995280246,LastTimestamp:2026-01-21 00:58:18.471313897 +0000 UTC m=+0.995280246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-af1f1f5a24,}" Jan 21 00:58:20.552706 containerd[1680]: time="2026-01-21T00:58:20.552543520Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-af1f1f5a24,Uid:3d02a747eede2de7b982876e4c5c8d07,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:20.569892 systemd[1]: Started cri-containerd-fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45.scope - libcontainer container fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45. Jan 21 00:58:20.579000 audit: BPF prog-id=83 op=LOAD Jan 21 00:58:20.580000 audit: BPF prog-id=84 op=LOAD Jan 21 00:58:20.580000 audit[2586]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.580000 audit: BPF prog-id=84 op=UNLOAD Jan 21 00:58:20.580000 audit[2586]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.580000 audit: BPF prog-id=85 op=LOAD Jan 21 00:58:20.580000 audit[2586]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.581000 audit: BPF prog-id=86 op=LOAD Jan 21 00:58:20.581000 audit[2586]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.581000 audit: BPF prog-id=86 op=UNLOAD Jan 21 00:58:20.581000 audit[2586]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.581000 audit: BPF prog-id=85 op=UNLOAD Jan 21 00:58:20.581000 audit[2586]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 
pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.581000 audit: BPF prog-id=87 op=LOAD Jan 21 00:58:20.581000 audit[2586]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2575 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664333661333963383062653235623532626362353438356464396366 Jan 21 00:58:20.595462 containerd[1680]: time="2026-01-21T00:58:20.595423781Z" level=info msg="connecting to shim 8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f" address="unix:///run/containerd/s/8d8d74b139111924092190e0a48d8853d1c95fbc706f7151d9830eb10c18ac48" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:20.602943 containerd[1680]: time="2026-01-21T00:58:20.602896050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-af1f1f5a24,Uid:691b10bb0b58b5aa64e93e5d2dcfd029,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:20.623885 systemd[1]: Started cri-containerd-8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f.scope - libcontainer container 8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f. 
Jan 21 00:58:20.640616 containerd[1680]: time="2026-01-21T00:58:20.640584013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-af1f1f5a24,Uid:e977b4b2c6a0e29608872340f5eef491,Namespace:kube-system,Attempt:0,} returns sandbox id \"fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45\"" Jan 21 00:58:20.646000 audit: BPF prog-id=88 op=LOAD Jan 21 00:58:20.648000 audit: BPF prog-id=89 op=LOAD Jan 21 00:58:20.648000 audit[2624]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.650970 containerd[1680]: time="2026-01-21T00:58:20.650808191Z" level=info msg="connecting to shim 2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416" address="unix:///run/containerd/s/f016eb9e19f88219bbe73ae3611c579f01419a2f69cf4ae1a48acb5c18763823" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:20.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.651000 audit: BPF prog-id=89 op=UNLOAD Jan 21 00:58:20.651000 audit[2624]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.651000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.651000 audit: BPF prog-id=90 op=LOAD Jan 21 00:58:20.651000 audit[2624]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.652788 kubelet[2516]: I0121 00:58:20.652707 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.653079 kubelet[2516]: E0121 00:58:20.653045 2516 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.74:6443/api/v1/nodes\": dial tcp 10.0.5.74:6443: connect: connection refused" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:20.652000 audit: BPF prog-id=91 op=LOAD Jan 21 00:58:20.653354 containerd[1680]: time="2026-01-21T00:58:20.653336496Z" level=info msg="CreateContainer within sandbox \"fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 21 00:58:20.652000 audit[2624]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.652000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.652000 audit: BPF prog-id=91 op=UNLOAD Jan 21 00:58:20.652000 audit[2624]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.652000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.652000 audit: BPF prog-id=90 op=UNLOAD Jan 21 00:58:20.652000 audit[2624]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.652000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.653000 audit: BPF prog-id=92 op=LOAD Jan 21 00:58:20.653000 audit[2624]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2613 pid=2624 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.653000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839383161303635646633666630633239366338303631636634643939 Jan 21 00:58:20.665658 containerd[1680]: time="2026-01-21T00:58:20.665632898Z" level=info msg="Container 3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:20.676037 containerd[1680]: time="2026-01-21T00:58:20.676004061Z" level=info msg="CreateContainer within sandbox \"fd36a39c80be25b52bcb5485dd9cfe1a608e6d1246b98d6763aa99cdaaac2d45\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af\"" Jan 21 00:58:20.677914 containerd[1680]: time="2026-01-21T00:58:20.677891345Z" level=info msg="StartContainer for \"3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af\"" Jan 21 00:58:20.678055 systemd[1]: Started cri-containerd-2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416.scope - libcontainer container 2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416. Jan 21 00:58:20.679131 containerd[1680]: time="2026-01-21T00:58:20.679108056Z" level=info msg="connecting to shim 3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af" address="unix:///run/containerd/s/6ae0c80bd0912be6d5b9b8292c444e2993f8603b07bb667ebfd9f6eced0c9145" protocol=ttrpc version=3 Jan 21 00:58:20.699170 systemd[1]: Started cri-containerd-3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af.scope - libcontainer container 3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af. 
Jan 21 00:58:20.702000 audit: BPF prog-id=93 op=LOAD Jan 21 00:58:20.704000 audit: BPF prog-id=94 op=LOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=94 op=UNLOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=95 op=LOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=96 op=LOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=96 op=UNLOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=95 op=UNLOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.704000 audit: BPF prog-id=97 op=LOAD Jan 21 00:58:20.704000 audit[2671]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2658 pid=2671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265393439323932316231383336616136663837353262303738326562 Jan 21 00:58:20.717631 containerd[1680]: time="2026-01-21T00:58:20.717556810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-af1f1f5a24,Uid:3d02a747eede2de7b982876e4c5c8d07,Namespace:kube-system,Attempt:0,} returns sandbox id \"8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f\"" Jan 21 00:58:20.718000 audit: BPF prog-id=98 op=LOAD Jan 21 00:58:20.719000 audit: BPF prog-id=99 op=LOAD Jan 21 00:58:20.719000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.719000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.719000 audit: BPF prog-id=99 op=UNLOAD Jan 21 00:58:20.719000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.719000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.720000 audit: BPF prog-id=100 op=LOAD Jan 21 00:58:20.720000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.720000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.720000 audit: BPF prog-id=101 op=LOAD Jan 21 00:58:20.720000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:58:20.720000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.720000 audit: BPF prog-id=101 op=UNLOAD Jan 21 00:58:20.720000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.720000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.720000 audit: BPF prog-id=100 op=UNLOAD Jan 21 00:58:20.720000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.720000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.720000 audit: BPF prog-id=102 op=LOAD Jan 21 00:58:20.720000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2575 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.720000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363303630343833353763363565306534326334303465613066646665 Jan 21 00:58:20.722167 containerd[1680]: time="2026-01-21T00:58:20.721854524Z" level=info msg="CreateContainer within sandbox \"8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 21 00:58:20.736613 containerd[1680]: time="2026-01-21T00:58:20.736582229Z" level=info msg="Container df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:20.747784 containerd[1680]: time="2026-01-21T00:58:20.747634617Z" level=info msg="CreateContainer within sandbox \"8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102\"" Jan 21 00:58:20.748824 containerd[1680]: time="2026-01-21T00:58:20.748668238Z" level=info msg="StartContainer for \"df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102\"" Jan 21 00:58:20.750699 containerd[1680]: time="2026-01-21T00:58:20.750575990Z" level=info msg="connecting to shim df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102" address="unix:///run/containerd/s/8d8d74b139111924092190e0a48d8853d1c95fbc706f7151d9830eb10c18ac48" protocol=ttrpc version=3 Jan 21 00:58:20.763919 containerd[1680]: time="2026-01-21T00:58:20.762879377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-af1f1f5a24,Uid:691b10bb0b58b5aa64e93e5d2dcfd029,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416\"" Jan 21 00:58:20.767370 containerd[1680]: time="2026-01-21T00:58:20.767325536Z" level=info msg="CreateContainer within sandbox \"2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 21 00:58:20.778991 systemd[1]: Started cri-containerd-df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102.scope - libcontainer container df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102. Jan 21 00:58:20.782350 containerd[1680]: time="2026-01-21T00:58:20.782291845Z" level=info msg="Container 7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:20.785571 containerd[1680]: time="2026-01-21T00:58:20.785497187Z" level=info msg="StartContainer for \"3c06048357c65e0e42c404ea0fdfee78d76de0b1c936e855c052be812088f8af\" returns successfully" Jan 21 00:58:20.793583 containerd[1680]: time="2026-01-21T00:58:20.793031926Z" level=info msg="CreateContainer within sandbox \"2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960\"" Jan 21 00:58:20.794043 containerd[1680]: time="2026-01-21T00:58:20.793961467Z" level=info msg="StartContainer for \"7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960\"" Jan 21 00:58:20.795315 containerd[1680]: time="2026-01-21T00:58:20.795292974Z" level=info msg="connecting to shim 7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960" address="unix:///run/containerd/s/f016eb9e19f88219bbe73ae3611c579f01419a2f69cf4ae1a48acb5c18763823" protocol=ttrpc version=3 Jan 21 00:58:20.801000 audit: BPF prog-id=103 op=LOAD Jan 21 00:58:20.803000 audit: BPF prog-id=104 op=LOAD Jan 21 00:58:20.803000 audit[2725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 
a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=104 op=UNLOAD Jan 21 00:58:20.804000 audit[2725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=105 op=LOAD Jan 21 00:58:20.804000 audit[2725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=106 op=LOAD Jan 21 00:58:20.804000 audit[2725]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=106 op=UNLOAD Jan 21 00:58:20.804000 audit[2725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=105 op=UNLOAD Jan 21 00:58:20.804000 audit[2725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.804000 audit: BPF prog-id=107 op=LOAD Jan 21 
00:58:20.804000 audit[2725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2613 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6466343232363964336461336135356235623166343166643438663666 Jan 21 00:58:20.821967 systemd[1]: Started cri-containerd-7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960.scope - libcontainer container 7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960. Jan 21 00:58:20.849000 audit: BPF prog-id=108 op=LOAD Jan 21 00:58:20.849000 audit: BPF prog-id=109 op=LOAD Jan 21 00:58:20.849000 audit[2756]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.849000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.849000 audit: BPF prog-id=109 op=UNLOAD Jan 21 00:58:20.849000 audit[2756]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.849000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.850000 audit: BPF prog-id=110 op=LOAD Jan 21 00:58:20.850000 audit[2756]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.850000 audit: BPF prog-id=111 op=LOAD Jan 21 00:58:20.850000 audit[2756]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.850000 audit: BPF prog-id=111 op=UNLOAD Jan 21 00:58:20.850000 audit[2756]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:58:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.850000 audit: BPF prog-id=110 op=UNLOAD Jan 21 00:58:20.850000 audit[2756]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.850000 audit: BPF prog-id=112 op=LOAD Jan 21 00:58:20.850000 audit[2756]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2658 pid=2756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765623233653263663730643661313531343532626432666664343362 Jan 21 00:58:20.857995 containerd[1680]: time="2026-01-21T00:58:20.857967900Z" level=info msg="StartContainer for \"df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102\" returns successfully" Jan 21 00:58:20.900033 containerd[1680]: time="2026-01-21T00:58:20.900003568Z" level=info msg="StartContainer for 
\"7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960\" returns successfully" Jan 21 00:58:21.454711 kubelet[2516]: I0121 00:58:21.454318 2516 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:21.672624 kubelet[2516]: E0121 00:58:21.672602 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:21.673570 kubelet[2516]: E0121 00:58:21.673470 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:21.675898 kubelet[2516]: E0121 00:58:21.675880 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.495674 kubelet[2516]: E0121 00:58:22.495621 2516 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.557471 kubelet[2516]: I0121 00:58:22.557433 2516 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.557746 kubelet[2516]: E0121 00:58:22.557615 2516 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4547-0-0-n-af1f1f5a24\": node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:22.573113 kubelet[2516]: E0121 00:58:22.573073 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:22.673772 kubelet[2516]: E0121 00:58:22.673570 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:22.678154 
kubelet[2516]: E0121 00:58:22.678107 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.678425 kubelet[2516]: E0121 00:58:22.678352 2516 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.774016 kubelet[2516]: E0121 00:58:22.773896 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:22.874892 kubelet[2516]: E0121 00:58:22.874845 2516 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:22.990160 kubelet[2516]: I0121 00:58:22.989895 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.995636 kubelet[2516]: E0121 00:58:22.995607 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.995636 kubelet[2516]: I0121 00:58:22.995631 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.997017 kubelet[2516]: E0121 00:58:22.996888 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.997017 kubelet[2516]: I0121 00:58:22.996908 2516 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:22.997751 kubelet[2516]: E0121 00:58:22.997732 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-af1f1f5a24\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:23.450641 kubelet[2516]: I0121 00:58:23.450459 2516 apiserver.go:52] "Watching apiserver" Jan 21 00:58:23.493127 kubelet[2516]: I0121 00:58:23.493094 2516 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 21 00:58:23.679062 kubelet[2516]: I0121 00:58:23.678864 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:23.679062 kubelet[2516]: I0121 00:58:23.678955 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:24.557208 systemd[1]: Reload requested from client PID 2803 ('systemctl') (unit session-10.scope)... Jan 21 00:58:24.557562 systemd[1]: Reloading... Jan 21 00:58:24.652716 zram_generator::config[2846]: No configuration found. Jan 21 00:58:24.680864 kubelet[2516]: I0121 00:58:24.680584 2516 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:24.687705 kubelet[2516]: E0121 00:58:24.687590 2516 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:24.858386 systemd[1]: Reloading finished in 300 ms. Jan 21 00:58:24.883256 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:58:24.899027 systemd[1]: kubelet.service: Deactivated successfully. Jan 21 00:58:24.899327 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 21 00:58:24.902034 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 21 00:58:24.902082 kernel: audit: type=1131 audit(1768957104.898:398): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:24.898000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:24.899400 systemd[1]: kubelet.service: Consumed 631ms CPU time, 131.9M memory peak. Jan 21 00:58:24.904260 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 21 00:58:24.904000 audit: BPF prog-id=113 op=LOAD Jan 21 00:58:24.907657 kernel: audit: type=1334 audit(1768957104.904:399): prog-id=113 op=LOAD Jan 21 00:58:24.907743 kernel: audit: type=1334 audit(1768957104.904:400): prog-id=114 op=LOAD Jan 21 00:58:24.904000 audit: BPF prog-id=114 op=LOAD Jan 21 00:58:24.904000 audit: BPF prog-id=63 op=UNLOAD Jan 21 00:58:24.910315 kernel: audit: type=1334 audit(1768957104.904:401): prog-id=63 op=UNLOAD Jan 21 00:58:24.910375 kernel: audit: type=1334 audit(1768957104.904:402): prog-id=64 op=UNLOAD Jan 21 00:58:24.904000 audit: BPF prog-id=64 op=UNLOAD Jan 21 00:58:24.911397 kernel: audit: type=1334 audit(1768957104.906:403): prog-id=115 op=LOAD Jan 21 00:58:24.906000 audit: BPF prog-id=115 op=LOAD Jan 21 00:58:24.912467 kernel: audit: type=1334 audit(1768957104.906:404): prog-id=76 op=UNLOAD Jan 21 00:58:24.906000 audit: BPF prog-id=76 op=UNLOAD Jan 21 00:58:24.914258 kernel: audit: type=1334 audit(1768957104.906:405): prog-id=116 op=LOAD Jan 21 00:58:24.906000 audit: BPF prog-id=116 op=LOAD Jan 21 00:58:24.915389 kernel: audit: type=1334 audit(1768957104.907:406): prog-id=117 op=LOAD Jan 21 00:58:24.907000 audit: BPF prog-id=117 op=LOAD Jan 21 00:58:24.916529 kernel: audit: 
type=1334 audit(1768957104.907:407): prog-id=77 op=UNLOAD Jan 21 00:58:24.907000 audit: BPF prog-id=77 op=UNLOAD Jan 21 00:58:24.907000 audit: BPF prog-id=78 op=UNLOAD Jan 21 00:58:24.910000 audit: BPF prog-id=118 op=LOAD Jan 21 00:58:24.910000 audit: BPF prog-id=65 op=UNLOAD Jan 21 00:58:24.912000 audit: BPF prog-id=119 op=LOAD Jan 21 00:58:24.912000 audit: BPF prog-id=70 op=UNLOAD Jan 21 00:58:24.912000 audit: BPF prog-id=120 op=LOAD Jan 21 00:58:24.912000 audit: BPF prog-id=121 op=LOAD Jan 21 00:58:24.912000 audit: BPF prog-id=71 op=UNLOAD Jan 21 00:58:24.912000 audit: BPF prog-id=72 op=UNLOAD Jan 21 00:58:24.914000 audit: BPF prog-id=122 op=LOAD Jan 21 00:58:24.914000 audit: BPF prog-id=79 op=UNLOAD Jan 21 00:58:24.914000 audit: BPF prog-id=123 op=LOAD Jan 21 00:58:24.914000 audit: BPF prog-id=124 op=LOAD Jan 21 00:58:24.914000 audit: BPF prog-id=80 op=UNLOAD Jan 21 00:58:24.914000 audit: BPF prog-id=81 op=UNLOAD Jan 21 00:58:24.915000 audit: BPF prog-id=125 op=LOAD Jan 21 00:58:24.915000 audit: BPF prog-id=67 op=UNLOAD Jan 21 00:58:24.915000 audit: BPF prog-id=126 op=LOAD Jan 21 00:58:24.915000 audit: BPF prog-id=127 op=LOAD Jan 21 00:58:24.915000 audit: BPF prog-id=68 op=UNLOAD Jan 21 00:58:24.915000 audit: BPF prog-id=69 op=UNLOAD Jan 21 00:58:24.915000 audit: BPF prog-id=128 op=LOAD Jan 21 00:58:24.925000 audit: BPF prog-id=82 op=UNLOAD Jan 21 00:58:24.926000 audit: BPF prog-id=129 op=LOAD Jan 21 00:58:24.926000 audit: BPF prog-id=66 op=UNLOAD Jan 21 00:58:24.926000 audit: BPF prog-id=130 op=LOAD Jan 21 00:58:24.926000 audit: BPF prog-id=73 op=UNLOAD Jan 21 00:58:24.926000 audit: BPF prog-id=131 op=LOAD Jan 21 00:58:24.926000 audit: BPF prog-id=132 op=LOAD Jan 21 00:58:24.926000 audit: BPF prog-id=74 op=UNLOAD Jan 21 00:58:24.926000 audit: BPF prog-id=75 op=UNLOAD Jan 21 00:58:25.047389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 21 00:58:25.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:25.055373 (kubelet)[2901]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 21 00:58:25.154816 kubelet[2901]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 00:58:25.154816 kubelet[2901]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 21 00:58:25.154816 kubelet[2901]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 00:58:25.155407 kubelet[2901]: I0121 00:58:25.154797 2901 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 21 00:58:25.160753 kubelet[2901]: I0121 00:58:25.160728 2901 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 21 00:58:25.161324 kubelet[2901]: I0121 00:58:25.160900 2901 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 00:58:25.161519 kubelet[2901]: I0121 00:58:25.161509 2901 server.go:954] "Client rotation is on, will bootstrap in background" Jan 21 00:58:25.163667 kubelet[2901]: I0121 00:58:25.163652 2901 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 21 00:58:25.166323 kubelet[2901]: I0121 00:58:25.166298 2901 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 21 00:58:25.169771 kubelet[2901]: I0121 00:58:25.169748 2901 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 21 00:58:25.172536 kubelet[2901]: I0121 00:58:25.172518 2901 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 21 00:58:25.172709 kubelet[2901]: I0121 00:58:25.172670 2901 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 00:58:25.172875 kubelet[2901]: I0121 00:58:25.172710 2901 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-af1f1f5a24","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"
none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 00:58:25.173000 kubelet[2901]: I0121 00:58:25.172879 2901 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 00:58:25.173000 kubelet[2901]: I0121 00:58:25.172888 2901 container_manager_linux.go:304] "Creating device plugin manager" Jan 21 00:58:25.173000 kubelet[2901]: I0121 00:58:25.172929 2901 state_mem.go:36] "Initialized new in-memory state store" Jan 21 00:58:25.173073 kubelet[2901]: I0121 00:58:25.173046 2901 kubelet.go:446] "Attempting to sync node with API server" Jan 21 00:58:25.173073 kubelet[2901]: I0121 00:58:25.173062 2901 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 00:58:25.173569 kubelet[2901]: I0121 00:58:25.173101 2901 kubelet.go:352] "Adding apiserver pod source" Jan 21 00:58:25.173569 kubelet[2901]: I0121 00:58:25.173111 2901 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 00:58:25.173910 kubelet[2901]: I0121 00:58:25.173899 2901 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 21 00:58:25.174273 kubelet[2901]: I0121 00:58:25.174264 2901 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 00:58:25.174668 kubelet[2901]: I0121 00:58:25.174657 2901 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 21 00:58:25.174736 kubelet[2901]: I0121 00:58:25.174731 2901 server.go:1287] "Started kubelet" Jan 21 00:58:25.179635 kubelet[2901]: I0121 00:58:25.179620 2901 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 00:58:25.182927 kubelet[2901]: I0121 
00:58:25.182885 2901 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 00:58:25.184001 kubelet[2901]: I0121 00:58:25.183951 2901 server.go:479] "Adding debug handlers to kubelet server" Jan 21 00:58:25.185959 kubelet[2901]: I0121 00:58:25.185914 2901 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 00:58:25.186098 kubelet[2901]: I0121 00:58:25.186088 2901 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 00:58:25.186884 kubelet[2901]: I0121 00:58:25.186869 2901 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 21 00:58:25.188736 kubelet[2901]: I0121 00:58:25.188720 2901 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 21 00:58:25.189055 kubelet[2901]: E0121 00:58:25.188950 2901 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-af1f1f5a24\" not found" Jan 21 00:58:25.194166 kubelet[2901]: I0121 00:58:25.194148 2901 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 21 00:58:25.194253 kubelet[2901]: I0121 00:58:25.194242 2901 reconciler.go:26] "Reconciler: start to sync state" Jan 21 00:58:25.198705 kubelet[2901]: I0121 00:58:25.197550 2901 factory.go:221] Registration of the systemd container factory successfully Jan 21 00:58:25.198705 kubelet[2901]: I0121 00:58:25.197647 2901 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 21 00:58:25.205345 kubelet[2901]: E0121 00:58:25.204289 2901 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 21 00:58:25.205546 kubelet[2901]: I0121 00:58:25.205531 2901 factory.go:221] Registration of the containerd container factory successfully Jan 21 00:58:25.207510 kubelet[2901]: I0121 00:58:25.207485 2901 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 00:58:25.209823 kubelet[2901]: I0121 00:58:25.209800 2901 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 21 00:58:25.209823 kubelet[2901]: I0121 00:58:25.209828 2901 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 21 00:58:25.209903 kubelet[2901]: I0121 00:58:25.209844 2901 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 21 00:58:25.209903 kubelet[2901]: I0121 00:58:25.209853 2901 kubelet.go:2382] "Starting kubelet main sync loop" Jan 21 00:58:25.209903 kubelet[2901]: E0121 00:58:25.209890 2901 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 00:58:25.243633 kubelet[2901]: I0121 00:58:25.243604 2901 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 21 00:58:25.243633 kubelet[2901]: I0121 00:58:25.243621 2901 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 21 00:58:25.243802 kubelet[2901]: I0121 00:58:25.243649 2901 state_mem.go:36] "Initialized new in-memory state store" Jan 21 00:58:25.243825 kubelet[2901]: I0121 00:58:25.243810 2901 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 21 00:58:25.243845 kubelet[2901]: I0121 00:58:25.243819 2901 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 21 00:58:25.243845 kubelet[2901]: I0121 00:58:25.243835 2901 policy_none.go:49] "None policy: Start" Jan 21 00:58:25.243845 kubelet[2901]: I0121 00:58:25.243842 2901 memory_manager.go:186] "Starting 
memorymanager" policy="None" Jan 21 00:58:25.243917 kubelet[2901]: I0121 00:58:25.243851 2901 state_mem.go:35] "Initializing new in-memory state store" Jan 21 00:58:25.243945 kubelet[2901]: I0121 00:58:25.243936 2901 state_mem.go:75] "Updated machine memory state" Jan 21 00:58:25.246949 kubelet[2901]: I0121 00:58:25.246920 2901 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 00:58:25.247486 kubelet[2901]: I0121 00:58:25.247045 2901 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 00:58:25.247486 kubelet[2901]: I0121 00:58:25.247054 2901 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 00:58:25.247486 kubelet[2901]: I0121 00:58:25.247403 2901 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 00:58:25.251059 kubelet[2901]: E0121 00:58:25.250699 2901 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 21 00:58:25.310914 kubelet[2901]: I0121 00:58:25.310885 2901 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.311086 kubelet[2901]: I0121 00:58:25.311073 2901 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.311310 kubelet[2901]: I0121 00:58:25.311301 2901 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.320655 kubelet[2901]: E0121 00:58:25.320626 2901 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-af1f1f5a24\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.321086 kubelet[2901]: E0121 00:58:25.321069 2901 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.351531 kubelet[2901]: I0121 00:58:25.351273 2901 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.360836 kubelet[2901]: I0121 00:58:25.360791 2901 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.360942 kubelet[2901]: I0121 00:58:25.360880 2901 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.395998 kubelet[2901]: I0121 00:58:25.395964 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" Jan 21 00:58:25.395998 kubelet[2901]: I0121 
00:58:25.396001 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396152 kubelet[2901]: I0121 00:58:25.396023 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e977b4b2c6a0e29608872340f5eef491-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" (UID: \"e977b4b2c6a0e29608872340f5eef491\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396152 kubelet[2901]: I0121 00:58:25.396045 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396152 kubelet[2901]: I0121 00:58:25.396060 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396152 kubelet[2901]: I0121 00:58:25.396074 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396152 kubelet[2901]: I0121 00:58:25.396087 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/691b10bb0b58b5aa64e93e5d2dcfd029-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-af1f1f5a24\" (UID: \"691b10bb0b58b5aa64e93e5d2dcfd029\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396259 kubelet[2901]: I0121 00:58:25.396101 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:25.396259 kubelet[2901]: I0121 00:58:25.396117 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d02a747eede2de7b982876e4c5c8d07-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-af1f1f5a24\" (UID: \"3d02a747eede2de7b982876e4c5c8d07\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:26.178725 kubelet[2901]: I0121 00:58:26.178364 2901 apiserver.go:52] "Watching apiserver"
Jan 21 00:58:26.194711 kubelet[2901]: I0121 00:58:26.194646 2901 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Jan 21 00:58:26.223863 kubelet[2901]: I0121 00:58:26.223544 2901 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:26.223863 kubelet[2901]: I0121 00:58:26.223787 2901 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:26.231294 kubelet[2901]: E0121 00:58:26.231244 2901 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-af1f1f5a24\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:26.231911 kubelet[2901]: E0121 00:58:26.231769 2901 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-af1f1f5a24\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24"
Jan 21 00:58:26.248328 kubelet[2901]: I0121 00:58:26.248033 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-af1f1f5a24" podStartSLOduration=1.248016654 podStartE2EDuration="1.248016654s" podCreationTimestamp="2026-01-21 00:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:58:26.247880964 +0000 UTC m=+1.188496578" watchObservedRunningTime="2026-01-21 00:58:26.248016654 +0000 UTC m=+1.188632262"
Jan 21 00:58:26.248564 kubelet[2901]: I0121 00:58:26.248540 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-n-af1f1f5a24" podStartSLOduration=3.248530509 podStartE2EDuration="3.248530509s" podCreationTimestamp="2026-01-21 00:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:58:26.240041532 +0000 UTC m=+1.180657149" watchObservedRunningTime="2026-01-21 00:58:26.248530509 +0000 UTC m=+1.189146123"
Jan 21 00:58:26.257174 kubelet[2901]: I0121 00:58:26.256273 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-n-af1f1f5a24" podStartSLOduration=3.25625047 podStartE2EDuration="3.25625047s" podCreationTimestamp="2026-01-21 00:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:58:26.256187528 +0000 UTC m=+1.196803125" watchObservedRunningTime="2026-01-21 00:58:26.25625047 +0000 UTC m=+1.196866083"
Jan 21 00:58:30.005497 kubelet[2901]: I0121 00:58:30.005461 2901 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 21 00:58:30.006438 containerd[1680]: time="2026-01-21T00:58:30.006398658Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 21 00:58:30.006901 kubelet[2901]: I0121 00:58:30.006562 2901 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jan 21 00:58:30.651256 systemd[1]: Created slice kubepods-besteffort-pod03079e19_18b9_4e8b_93e3_eade210016e7.slice - libcontainer container kubepods-besteffort-pod03079e19_18b9_4e8b_93e3_eade210016e7.slice.
Jan 21 00:58:30.732132 kubelet[2901]: I0121 00:58:30.732022 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/03079e19-18b9-4e8b-93e3-eade210016e7-kube-proxy\") pod \"kube-proxy-rjj78\" (UID: \"03079e19-18b9-4e8b-93e3-eade210016e7\") " pod="kube-system/kube-proxy-rjj78"
Jan 21 00:58:30.732132 kubelet[2901]: I0121 00:58:30.732054 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/03079e19-18b9-4e8b-93e3-eade210016e7-xtables-lock\") pod \"kube-proxy-rjj78\" (UID: \"03079e19-18b9-4e8b-93e3-eade210016e7\") " pod="kube-system/kube-proxy-rjj78"
Jan 21 00:58:30.732132 kubelet[2901]: I0121 00:58:30.732073 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03079e19-18b9-4e8b-93e3-eade210016e7-lib-modules\") pod \"kube-proxy-rjj78\" (UID: \"03079e19-18b9-4e8b-93e3-eade210016e7\") " pod="kube-system/kube-proxy-rjj78"
Jan 21 00:58:30.732132 kubelet[2901]: I0121 00:58:30.732089 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzwj\" (UniqueName: \"kubernetes.io/projected/03079e19-18b9-4e8b-93e3-eade210016e7-kube-api-access-crzwj\") pod \"kube-proxy-rjj78\" (UID: \"03079e19-18b9-4e8b-93e3-eade210016e7\") " pod="kube-system/kube-proxy-rjj78"
Jan 21 00:58:30.838936 kubelet[2901]: E0121 00:58:30.838906 2901 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Jan 21 00:58:30.838936 kubelet[2901]: E0121 00:58:30.838934 2901 projected.go:194] Error preparing data for projected volume kube-api-access-crzwj for pod kube-system/kube-proxy-rjj78: configmap "kube-root-ca.crt" not found
Jan 21 00:58:30.839147 kubelet[2901]: E0121 00:58:30.838994 2901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/03079e19-18b9-4e8b-93e3-eade210016e7-kube-api-access-crzwj podName:03079e19-18b9-4e8b-93e3-eade210016e7 nodeName:}" failed. No retries permitted until 2026-01-21 00:58:31.338972946 +0000 UTC m=+6.279588544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-crzwj" (UniqueName: "kubernetes.io/projected/03079e19-18b9-4e8b-93e3-eade210016e7-kube-api-access-crzwj") pod "kube-proxy-rjj78" (UID: "03079e19-18b9-4e8b-93e3-eade210016e7") : configmap "kube-root-ca.crt" not found
Jan 21 00:58:31.073061 systemd[1]: Created slice kubepods-besteffort-poda5089f61_d17c_42c3_bc87_ead8c34f0198.slice - libcontainer container kubepods-besteffort-poda5089f61_d17c_42c3_bc87_ead8c34f0198.slice.
Jan 21 00:58:31.133776 kubelet[2901]: I0121 00:58:31.133657 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a5089f61-d17c-42c3-bc87-ead8c34f0198-var-lib-calico\") pod \"tigera-operator-7dcd859c48-fvzpt\" (UID: \"a5089f61-d17c-42c3-bc87-ead8c34f0198\") " pod="tigera-operator/tigera-operator-7dcd859c48-fvzpt"
Jan 21 00:58:31.133776 kubelet[2901]: I0121 00:58:31.133728 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fbq\" (UniqueName: \"kubernetes.io/projected/a5089f61-d17c-42c3-bc87-ead8c34f0198-kube-api-access-w5fbq\") pod \"tigera-operator-7dcd859c48-fvzpt\" (UID: \"a5089f61-d17c-42c3-bc87-ead8c34f0198\") " pod="tigera-operator/tigera-operator-7dcd859c48-fvzpt"
Jan 21 00:58:31.378878 containerd[1680]: time="2026-01-21T00:58:31.378595155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-fvzpt,Uid:a5089f61-d17c-42c3-bc87-ead8c34f0198,Namespace:tigera-operator,Attempt:0,}"
Jan 21 00:58:31.405057 containerd[1680]: time="2026-01-21T00:58:31.405000530Z" level=info msg="connecting to shim 450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296" address="unix:///run/containerd/s/3f6435668d8f6faeb8dddb52e46ca1cadaaecc6349796259469e8bb9996dd34d" namespace=k8s.io protocol=ttrpc version=3
Jan 21 00:58:31.432950 systemd[1]: Started cri-containerd-450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296.scope - libcontainer container 450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296.
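The audit records that follow carry PROCTITLE fields as long hex strings: the kernel hex-encodes the process command line, with NUL bytes separating the arguments (and truncates long command lines). A small decoding sketch, offered as a reading aid for these logs rather than as part of any tool shown here:

```python
def decode_proctitle(hex_value: str) -> list[str]:
    """Decode an audit PROCTITLE field: hex-encoded argv, NUL-separated."""
    raw = bytes.fromhex(hex_value)
    return [arg.decode("utf-8", "replace") for arg in raw.split(b"\x00") if arg]

# A short prefix of the runc proctitle seen in the records below:
print(decode_proctitle("72756E63002D2D726F6F74"))  # ['runc', '--root']
```

Decoding the full values shows runc being invoked with `--root /run/containerd/runc/k8s.io` and a `--log` path under the containerd task directory for the sandbox in question.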
Jan 21 00:58:31.447000 audit: BPF prog-id=133 op=LOAD Jan 21 00:58:31.448859 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 21 00:58:31.448901 kernel: audit: type=1334 audit(1768957111.447:440): prog-id=133 op=LOAD Jan 21 00:58:31.451017 kernel: audit: type=1334 audit(1768957111.449:441): prog-id=134 op=LOAD Jan 21 00:58:31.449000 audit: BPF prog-id=134 op=LOAD Jan 21 00:58:31.449000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.453167 kernel: audit: type=1300 audit(1768957111.449:441): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.457058 kernel: audit: type=1327 audit(1768957111.449:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.449000 audit: BPF prog-id=134 op=UNLOAD Jan 21 00:58:31.460716 kernel: audit: type=1334 audit(1768957111.449:442): prog-id=134 op=UNLOAD Jan 21 00:58:31.460762 kernel: audit: type=1300 audit(1768957111.449:442): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 
ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.449000 audit[2966]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.465510 kernel: audit: type=1327 audit(1768957111.449:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.450000 audit: BPF prog-id=135 op=LOAD Jan 21 00:58:31.468895 kernel: audit: type=1334 audit(1768957111.450:443): prog-id=135 op=LOAD Jan 21 00:58:31.450000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.470171 kernel: audit: type=1300 audit(1768957111.450:443): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:31.450000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.451000 audit: BPF prog-id=136 op=LOAD Jan 21 00:58:31.451000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.451000 audit: BPF prog-id=136 op=UNLOAD Jan 21 00:58:31.451000 audit[2966]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.451000 audit: BPF prog-id=135 op=UNLOAD Jan 21 00:58:31.451000 audit[2966]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.476738 kernel: audit: type=1327 audit(1768957111.450:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.451000 audit: BPF prog-id=137 op=LOAD Jan 21 00:58:31.451000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2954 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435306333343336343036363130393937386665303966613939326164 Jan 21 00:58:31.498781 containerd[1680]: time="2026-01-21T00:58:31.498727826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-fvzpt,Uid:a5089f61-d17c-42c3-bc87-ead8c34f0198,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296\"" Jan 21 00:58:31.500440 containerd[1680]: time="2026-01-21T00:58:31.500306894Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 21 00:58:31.561447 containerd[1680]: time="2026-01-21T00:58:31.561407862Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-proxy-rjj78,Uid:03079e19-18b9-4e8b-93e3-eade210016e7,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:31.587332 containerd[1680]: time="2026-01-21T00:58:31.587270635Z" level=info msg="connecting to shim 78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a" address="unix:///run/containerd/s/c999db3f5f4ce0f8789f8e3cc1fcda5577b544cc15f41f4c245e7150a7961b68" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:31.608932 systemd[1]: Started cri-containerd-78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a.scope - libcontainer container 78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a. Jan 21 00:58:31.617000 audit: BPF prog-id=138 op=LOAD Jan 21 00:58:31.618000 audit: BPF prog-id=139 op=LOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=139 op=UNLOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=140 op=LOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=141 op=LOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=141 op=UNLOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=140 op=UNLOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.618000 audit: BPF prog-id=142 op=LOAD Jan 21 00:58:31.618000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3001 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738616266663666383939663465336431323733336631343264656261 Jan 21 00:58:31.633018 containerd[1680]: time="2026-01-21T00:58:31.632885236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rjj78,Uid:03079e19-18b9-4e8b-93e3-eade210016e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a\"" Jan 
21 00:58:31.636579 containerd[1680]: time="2026-01-21T00:58:31.636552379Z" level=info msg="CreateContainer within sandbox \"78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 21 00:58:31.648248 containerd[1680]: time="2026-01-21T00:58:31.648204199Z" level=info msg="Container 72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:31.658806 containerd[1680]: time="2026-01-21T00:58:31.658771002Z" level=info msg="CreateContainer within sandbox \"78abff6f899f4e3d12733f142deba525656de2af82c9eb377574d0bf8a92292a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12\"" Jan 21 00:58:31.659237 containerd[1680]: time="2026-01-21T00:58:31.659220771Z" level=info msg="StartContainer for \"72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12\"" Jan 21 00:58:31.661816 containerd[1680]: time="2026-01-21T00:58:31.661599605Z" level=info msg="connecting to shim 72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12" address="unix:///run/containerd/s/c999db3f5f4ce0f8789f8e3cc1fcda5577b544cc15f41f4c245e7150a7961b68" protocol=ttrpc version=3 Jan 21 00:58:31.680911 systemd[1]: Started cri-containerd-72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12.scope - libcontainer container 72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12. 
Jan 21 00:58:31.723000 audit: BPF prog-id=143 op=LOAD Jan 21 00:58:31.723000 audit[3044]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3001 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732666532343364353063343065636432356366336161383737646634 Jan 21 00:58:31.723000 audit: BPF prog-id=144 op=LOAD Jan 21 00:58:31.723000 audit[3044]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3001 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732666532343364353063343065636432356366336161383737646634 Jan 21 00:58:31.723000 audit: BPF prog-id=144 op=UNLOAD Jan 21 00:58:31.723000 audit[3044]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3001 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.723000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732666532343364353063343065636432356366336161383737646634 Jan 21 00:58:31.723000 audit: BPF prog-id=143 op=UNLOAD Jan 21 00:58:31.723000 audit[3044]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3001 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732666532343364353063343065636432356366336161383737646634 Jan 21 00:58:31.723000 audit: BPF prog-id=145 op=LOAD Jan 21 00:58:31.723000 audit[3044]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3001 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732666532343364353063343065636432356366336161383737646634 Jan 21 00:58:31.743981 containerd[1680]: time="2026-01-21T00:58:31.743932740Z" level=info msg="StartContainer for \"72fe243d50c40ecd25cf3aa877df4ccdf150010e68652eb46f6dffab2f36be12\" returns successfully" Jan 21 00:58:31.865000 audit[3107]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
Jan 21 00:58:31.865000 audit[3107]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffefcca4c90 a2=0 a3=7ffefcca4c7c items=0 ppid=3056 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.865000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 21 00:58:31.868000 audit[3108]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:31.868000 audit[3108]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffdba4a270 a2=0 a3=7fffdba4a25c items=0 ppid=3056 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.868000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 21 00:58:31.869000 audit[3110]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:31.869000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd398c7050 a2=0 a3=7ffd398c703c items=0 ppid=3056 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.869000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 21 00:58:31.870000 audit[3109]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3109 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.870000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffefd71d070 a2=0 a3=7ffefd71d05c items=0 ppid=3056 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.870000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 21 00:58:31.872000 audit[3112]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.872000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe3989bf20 a2=0 a3=7ffe3989bf0c items=0 ppid=3056 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.872000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 21 00:58:31.872000 audit[3113]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3113 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:31.872000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe0a207450 a2=0 a3=7ffe0a20743c items=0 ppid=3056 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.872000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 21 00:58:31.975000 audit[3114]: NETFILTER_CFG table=filter:60 family=2 
entries=1 op=nft_register_chain pid=3114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.975000 audit[3114]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fff128996d0 a2=0 a3=7fff128996bc items=0 ppid=3056 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.975000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 21 00:58:31.978000 audit[3116]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.978000 audit[3116]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc423908c0 a2=0 a3=7ffc423908ac items=0 ppid=3056 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.978000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 21 00:58:31.982000 audit[3119]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.982000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd3cd7f9b0 a2=0 a3=7ffd3cd7f99c items=0 ppid=3056 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.982000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 21 00:58:31.983000 audit[3120]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.983000 audit[3120]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc159c5b60 a2=0 a3=7ffc159c5b4c items=0 ppid=3056 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.983000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 21 00:58:31.986000 audit[3122]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.986000 audit[3122]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc336a2c30 a2=0 a3=7ffc336a2c1c items=0 ppid=3056 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.986000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 21 00:58:31.987000 audit[3123]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.987000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=100 a0=3 a1=7fffbf682b30 a2=0 a3=7fffbf682b1c items=0 ppid=3056 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.987000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 21 00:58:31.990000 audit[3125]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.990000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff902bfb20 a2=0 a3=7fff902bfb0c items=0 ppid=3056 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.990000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 21 00:58:31.993000 audit[3128]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3128 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.993000 audit[3128]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdfb16f270 a2=0 a3=7ffdfb16f25c items=0 ppid=3056 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.993000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 21 00:58:31.994000 audit[3129]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.994000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcb0ad0530 a2=0 a3=7ffcb0ad051c items=0 ppid=3056 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.994000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 21 00:58:31.997000 audit[3131]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.997000 audit[3131]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff3b992c40 a2=0 a3=7fff3b992c2c items=0 ppid=3056 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.997000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 21 00:58:31.998000 audit[3132]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:31.998000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 
a1=7ffcef3b0fe0 a2=0 a3=7ffcef3b0fcc items=0 ppid=3056 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:31.998000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 21 00:58:32.001000 audit[3134]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.001000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcc8509230 a2=0 a3=7ffcc850921c items=0 ppid=3056 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.001000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 21 00:58:32.004000 audit[3137]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.004000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd822b1d80 a2=0 a3=7ffd822b1d6c items=0 ppid=3056 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.004000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 21 00:58:32.007000 audit[3140]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.007000 audit[3140]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffff4c5e360 a2=0 a3=7ffff4c5e34c items=0 ppid=3056 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.007000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 21 00:58:32.008000 audit[3141]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.008000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe1e4ed660 a2=0 a3=7ffe1e4ed64c items=0 ppid=3056 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.008000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 21 00:58:32.011000 audit[3143]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.011000 audit[3143]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=524 a0=3 a1=7fff3713a0e0 a2=0 a3=7fff3713a0cc items=0 ppid=3056 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.011000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 21 00:58:32.015000 audit[3146]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.015000 audit[3146]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdc5d29750 a2=0 a3=7ffdc5d2973c items=0 ppid=3056 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.015000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 21 00:58:32.016000 audit[3147]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.016000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffd5ed6340 a2=0 a3=7fffd5ed632c items=0 ppid=3056 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.016000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 21 
00:58:32.018000 audit[3149]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 21 00:58:32.018000 audit[3149]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffed724bbc0 a2=0 a3=7ffed724bbac items=0 ppid=3056 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.018000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 21 00:58:32.044000 audit[3155]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:32.044000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffede79c300 a2=0 a3=7ffede79c2ec items=0 ppid=3056 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.044000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:32.052000 audit[3155]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:32.052000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffede79c300 a2=0 a3=7ffede79c2ec items=0 ppid=3056 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.052000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:32.053000 audit[3160]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.053000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffa809bef0 a2=0 a3=7fffa809bedc items=0 ppid=3056 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.053000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 21 00:58:32.056000 audit[3162]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3162 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.056000 audit[3162]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe946094e0 a2=0 a3=7ffe946094cc items=0 ppid=3056 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.056000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 21 00:58:32.060000 audit[3165]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.060000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 
a1=7ffc6d324b80 a2=0 a3=7ffc6d324b6c items=0 ppid=3056 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.060000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 21 00:58:32.061000 audit[3166]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.061000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffec44c4150 a2=0 a3=7ffec44c413c items=0 ppid=3056 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.061000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 21 00:58:32.063000 audit[3168]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3168 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.063000 audit[3168]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd67b952c0 a2=0 a3=7ffd67b952ac items=0 ppid=3056 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.063000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 21 00:58:32.064000 audit[3169]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3169 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.064000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdc7eaead0 a2=0 a3=7ffdc7eaeabc items=0 ppid=3056 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.064000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 21 00:58:32.067000 audit[3171]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.067000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd20884440 a2=0 a3=7ffd2088442c items=0 ppid=3056 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.067000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 21 00:58:32.071000 audit[3174]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3174 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.071000 audit[3174]: SYSCALL arch=c000003e syscall=46 
success=yes exit=828 a0=3 a1=7ffc034f7630 a2=0 a3=7ffc034f761c items=0 ppid=3056 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.071000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 21 00:58:32.073000 audit[3175]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.073000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffe157f920 a2=0 a3=7fffe157f90c items=0 ppid=3056 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.073000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 21 00:58:32.076000 audit[3177]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3177 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.076000 audit[3177]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcc2840350 a2=0 a3=7ffcc284033c items=0 ppid=3056 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.076000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 21 00:58:32.077000 audit[3178]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.077000 audit[3178]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffccde30a70 a2=0 a3=7ffccde30a5c items=0 ppid=3056 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.077000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 21 00:58:32.079000 audit[3180]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3180 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.079000 audit[3180]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffde0eee6d0 a2=0 a3=7ffde0eee6bc items=0 ppid=3056 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.079000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 21 00:58:32.083000 audit[3183]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.083000 audit[3183]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=748 a0=3 a1=7ffd152de030 a2=0 a3=7ffd152de01c items=0 ppid=3056 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.083000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 21 00:58:32.087000 audit[3186]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.087000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffca4ed3680 a2=0 a3=7ffca4ed366c items=0 ppid=3056 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.087000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 21 00:58:32.088000 audit[3187]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.088000 audit[3187]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffede860270 a2=0 a3=7ffede86025c items=0 ppid=3056 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.088000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 21 00:58:32.090000 audit[3189]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3189 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.090000 audit[3189]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff5e39bcf0 a2=0 a3=7fff5e39bcdc items=0 ppid=3056 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.090000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 21 00:58:32.093000 audit[3192]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.093000 audit[3192]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcd1bc7cd0 a2=0 a3=7ffcd1bc7cbc items=0 ppid=3056 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.093000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 21 00:58:32.094000 audit[3193]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.094000 audit[3193]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe837243d0 a2=0 a3=7ffe837243bc items=0 ppid=3056 
pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.094000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 21 00:58:32.098000 audit[3195]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.098000 audit[3195]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffdd92a57b0 a2=0 a3=7ffdd92a579c items=0 ppid=3056 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.098000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 21 00:58:32.099000 audit[3196]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.099000 audit[3196]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe74030c40 a2=0 a3=7ffe74030c2c items=0 ppid=3056 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.099000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 21 00:58:32.101000 audit[3198]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3198 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 21 00:58:32.101000 audit[3198]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffecabf5830 a2=0 a3=7ffecabf581c items=0 ppid=3056 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.101000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 21 00:58:32.104000 audit[3201]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 21 00:58:32.104000 audit[3201]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffe2e87d50 a2=0 a3=7fffe2e87d3c items=0 ppid=3056 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.104000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 21 00:58:32.108000 audit[3203]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 21 00:58:32.108000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffc9b06ad10 a2=0 a3=7ffc9b06acfc items=0 ppid=3056 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.108000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:32.108000 audit[3203]: NETFILTER_CFG table=nat:104 
family=10 entries=7 op=nft_register_chain pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 21 00:58:32.108000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffc9b06ad10 a2=0 a3=7ffc9b06acfc items=0 ppid=3056 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:32.108000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:33.158706 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount533476916.mount: Deactivated successfully. Jan 21 00:58:33.613313 containerd[1680]: time="2026-01-21T00:58:33.613253221Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:33.614513 containerd[1680]: time="2026-01-21T00:58:33.614348274Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 21 00:58:33.615916 containerd[1680]: time="2026-01-21T00:58:33.615897117Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:33.618845 containerd[1680]: time="2026-01-21T00:58:33.618803159Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:33.619642 containerd[1680]: time="2026-01-21T00:58:33.619385082Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest 
\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.119051387s" Jan 21 00:58:33.619793 containerd[1680]: time="2026-01-21T00:58:33.619704263Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 21 00:58:33.622308 containerd[1680]: time="2026-01-21T00:58:33.622261748Z" level=info msg="CreateContainer within sandbox \"450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 21 00:58:33.635102 containerd[1680]: time="2026-01-21T00:58:33.634629910Z" level=info msg="Container 0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:33.643408 containerd[1680]: time="2026-01-21T00:58:33.643377233Z" level=info msg="CreateContainer within sandbox \"450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca\"" Jan 21 00:58:33.644111 containerd[1680]: time="2026-01-21T00:58:33.644086859Z" level=info msg="StartContainer for \"0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca\"" Jan 21 00:58:33.644993 containerd[1680]: time="2026-01-21T00:58:33.644951956Z" level=info msg="connecting to shim 0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca" address="unix:///run/containerd/s/3f6435668d8f6faeb8dddb52e46ca1cadaaecc6349796259469e8bb9996dd34d" protocol=ttrpc version=3 Jan 21 00:58:33.665868 systemd[1]: Started cri-containerd-0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca.scope - libcontainer container 0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca. 
Jan 21 00:58:33.675000 audit: BPF prog-id=146 op=LOAD Jan 21 00:58:33.675000 audit: BPF prog-id=147 op=LOAD Jan 21 00:58:33.675000 audit[3212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.675000 audit: BPF prog-id=147 op=UNLOAD Jan 21 00:58:33.675000 audit[3212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.676000 audit: BPF prog-id=148 op=LOAD Jan 21 00:58:33.676000 audit[3212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.676000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.676000 audit: BPF prog-id=149 op=LOAD Jan 21 00:58:33.676000 audit[3212]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.676000 audit: BPF prog-id=149 op=UNLOAD Jan 21 00:58:33.676000 audit[3212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.676000 audit: BPF prog-id=148 op=UNLOAD Jan 21 00:58:33.676000 audit[3212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:33.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.676000 audit: BPF prog-id=150 op=LOAD Jan 21 00:58:33.676000 audit[3212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2954 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:33.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065663935323765303566353763633337633531636364346336333333 Jan 21 00:58:33.693141 containerd[1680]: time="2026-01-21T00:58:33.693114707Z" level=info msg="StartContainer for \"0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca\" returns successfully" Jan 21 00:58:34.248921 kubelet[2901]: I0121 00:58:34.248790 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-fvzpt" podStartSLOduration=1.128361499 podStartE2EDuration="3.248771926s" podCreationTimestamp="2026-01-21 00:58:31 +0000 UTC" firstStartedPulling="2026-01-21 00:58:31.499989841 +0000 UTC m=+6.440605438" lastFinishedPulling="2026-01-21 00:58:33.620400263 +0000 UTC m=+8.561015865" observedRunningTime="2026-01-21 00:58:34.248467848 +0000 UTC m=+9.189083467" watchObservedRunningTime="2026-01-21 00:58:34.248771926 +0000 UTC m=+9.189387537" Jan 21 00:58:34.249298 kubelet[2901]: I0121 00:58:34.249033 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rjj78" 
podStartSLOduration=4.24902651 podStartE2EDuration="4.24902651s" podCreationTimestamp="2026-01-21 00:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:58:32.245846432 +0000 UTC m=+7.186462050" watchObservedRunningTime="2026-01-21 00:58:34.24902651 +0000 UTC m=+9.189642129" Jan 21 00:58:39.210156 sudo[1946]: pam_unix(sudo:session): session closed for user root Jan 21 00:58:39.208000 audit[1946]: USER_END pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 21 00:58:39.211102 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 21 00:58:39.211159 kernel: audit: type=1106 audit(1768957119.208:520): pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 21 00:58:39.210000 audit[1946]: CRED_DISP pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 21 00:58:39.219699 kernel: audit: type=1104 audit(1768957119.210:521): pid=1946 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 21 00:58:39.311942 sshd[1945]: Connection closed by 4.153.228.146 port 47426 Jan 21 00:58:39.312630 sshd-session[1941]: pam_unix(sshd:session): session closed for user core Jan 21 00:58:39.312000 audit[1941]: USER_END pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 00:58:39.319703 kernel: audit: type=1106 audit(1768957119.312:522): pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 00:58:39.322001 systemd[1]: sshd@8-10.0.5.74:22-4.153.228.146:47426.service: Deactivated successfully. Jan 21 00:58:39.312000 audit[1941]: CRED_DISP pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 00:58:39.327710 kernel: audit: type=1104 audit(1768957119.312:523): pid=1941 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 00:58:39.328922 systemd[1]: session-10.scope: Deactivated successfully. Jan 21 00:58:39.329188 systemd[1]: session-10.scope: Consumed 3.578s CPU time, 229.3M memory peak. 
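Editor's note: the kubelet `pod_startup_latency_tracker` record above for the tigera-operator pod reports both an E2E duration and a shorter SLO duration. The SLO figure excludes image-pull time; a sketch of the arithmetic using the timestamps from that log line (seconds within minute 00:58, wall-clock approximation, so the result only approximately matches the logged value):

```python
# Timestamps (seconds past 00:58) taken from the tracker record above.
e2e = 34.248771926 - 31.0            # observedRunningTime - podCreationTimestamp
pull = 33.620400263 - 31.499989841   # lastFinishedPulling - firstStartedPulling
slo = e2e - pull                     # SLO duration excludes image pull time
print(round(slo, 6))                 # ~1.128362, vs logged podStartSLOduration=1.128361499
```

The tiny residual difference is expected: kubelet computes the durations from its monotonic clock offsets (the `m=+…` values), not from the formatted wall-clock timestamps.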
Jan 21 00:58:39.320000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.5.74:22-4.153.228.146:47426 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:39.333739 kernel: audit: type=1131 audit(1768957119.320:524): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.5.74:22-4.153.228.146:47426 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 00:58:39.334097 systemd-logind[1653]: Session 10 logged out. Waiting for processes to exit. Jan 21 00:58:39.337469 systemd-logind[1653]: Removed session 10. Jan 21 00:58:40.062000 audit[3292]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.067730 kernel: audit: type=1325 audit(1768957120.062:525): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.062000 audit[3292]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffeec336890 a2=0 a3=7ffeec33687c items=0 ppid=3056 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:40.072703 kernel: audit: type=1300 audit(1768957120.062:525): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffeec336890 a2=0 a3=7ffeec33687c items=0 ppid=3056 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:40.062000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 
Jan 21 00:58:40.077701 kernel: audit: type=1327 audit(1768957120.062:525): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:40.077756 kernel: audit: type=1325 audit(1768957120.072:526): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.072000 audit[3292]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.072000 audit[3292]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeec336890 a2=0 a3=0 items=0 ppid=3056 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:40.085717 kernel: audit: type=1300 audit(1768957120.072:526): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeec336890 a2=0 a3=0 items=0 ppid=3056 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:40.072000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:40.089000 audit[3294]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3294 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.089000 audit[3294]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffe7c585bb0 a2=0 a3=7ffe7c585b9c items=0 ppid=3056 pid=3294 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:40.089000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:40.093000 audit[3294]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3294 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:40.093000 audit[3294]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe7c585bb0 a2=0 a3=0 items=0 ppid=3056 pid=3294 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:40.093000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:41.657000 audit[3296]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3296 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:41.657000 audit[3296]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffd07e5dec0 a2=0 a3=7ffd07e5deac items=0 ppid=3056 pid=3296 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:41.657000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:41.662000 audit[3296]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3296 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:41.662000 audit[3296]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd07e5dec0 a2=0 a3=0 items=0 ppid=3056 pid=3296 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:41.662000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:42.678000 audit[3298]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3298 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:42.678000 audit[3298]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdd8230770 a2=0 a3=7ffdd823075c items=0 ppid=3056 pid=3298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:42.678000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:42.682000 audit[3298]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3298 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:42.682000 audit[3298]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdd8230770 a2=0 a3=0 items=0 ppid=3056 pid=3298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:42.682000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:43.317000 audit[3300]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3300 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:43.317000 audit[3300]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe5251caa0 a2=0 a3=7ffe5251ca8c items=0 ppid=3056 pid=3300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.317000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:43.325000 audit[3300]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3300 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:43.325000 audit[3300]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe5251caa0 a2=0 a3=0 items=0 ppid=3056 pid=3300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.325000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:43.358582 systemd[1]: Created slice kubepods-besteffort-pod77cc8a44_a428_4b01_bb84_4901934a7f50.slice - libcontainer container kubepods-besteffort-pod77cc8a44_a428_4b01_bb84_4901934a7f50.slice. 
Jan 21 00:58:43.412667 kubelet[2901]: I0121 00:58:43.412636 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hgc8\" (UniqueName: \"kubernetes.io/projected/77cc8a44-a428-4b01-bb84-4901934a7f50-kube-api-access-8hgc8\") pod \"calico-typha-68dd87d7b4-bbf8k\" (UID: \"77cc8a44-a428-4b01-bb84-4901934a7f50\") " pod="calico-system/calico-typha-68dd87d7b4-bbf8k" Jan 21 00:58:43.412667 kubelet[2901]: I0121 00:58:43.412669 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77cc8a44-a428-4b01-bb84-4901934a7f50-tigera-ca-bundle\") pod \"calico-typha-68dd87d7b4-bbf8k\" (UID: \"77cc8a44-a428-4b01-bb84-4901934a7f50\") " pod="calico-system/calico-typha-68dd87d7b4-bbf8k" Jan 21 00:58:43.413124 kubelet[2901]: I0121 00:58:43.412703 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/77cc8a44-a428-4b01-bb84-4901934a7f50-typha-certs\") pod \"calico-typha-68dd87d7b4-bbf8k\" (UID: \"77cc8a44-a428-4b01-bb84-4901934a7f50\") " pod="calico-system/calico-typha-68dd87d7b4-bbf8k" Jan 21 00:58:43.552472 systemd[1]: Created slice kubepods-besteffort-podfff7668a_1e6d_4df8_8d2a_dc72b548d806.slice - libcontainer container kubepods-besteffort-podfff7668a_1e6d_4df8_8d2a_dc72b548d806.slice. 
Jan 21 00:58:43.615035 kubelet[2901]: I0121 00:58:43.614613 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-cni-net-dir\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615035 kubelet[2901]: I0121 00:58:43.614720 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-xtables-lock\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615035 kubelet[2901]: I0121 00:58:43.614741 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fff7668a-1e6d-4df8-8d2a-dc72b548d806-node-certs\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615035 kubelet[2901]: I0121 00:58:43.614757 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-flexvol-driver-host\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615533 kubelet[2901]: I0121 00:58:43.615339 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fff7668a-1e6d-4df8-8d2a-dc72b548d806-tigera-ca-bundle\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615533 kubelet[2901]: I0121 00:58:43.615407 2901 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-var-lib-calico\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615533 kubelet[2901]: I0121 00:58:43.615421 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7gz\" (UniqueName: \"kubernetes.io/projected/fff7668a-1e6d-4df8-8d2a-dc72b548d806-kube-api-access-5b7gz\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615533 kubelet[2901]: I0121 00:58:43.615437 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-cni-bin-dir\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615533 kubelet[2901]: I0121 00:58:43.615462 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-policysync\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615701 kubelet[2901]: I0121 00:58:43.615475 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-lib-modules\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615701 kubelet[2901]: I0121 00:58:43.615488 2901 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-cni-log-dir\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.615701 kubelet[2901]: I0121 00:58:43.615503 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fff7668a-1e6d-4df8-8d2a-dc72b548d806-var-run-calico\") pod \"calico-node-m2vxz\" (UID: \"fff7668a-1e6d-4df8-8d2a-dc72b548d806\") " pod="calico-system/calico-node-m2vxz" Jan 21 00:58:43.663126 containerd[1680]: time="2026-01-21T00:58:43.662265209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68dd87d7b4-bbf8k,Uid:77cc8a44-a428-4b01-bb84-4901934a7f50,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:43.690390 containerd[1680]: time="2026-01-21T00:58:43.690355290Z" level=info msg="connecting to shim 08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091" address="unix:///run/containerd/s/26a948851580aa90a986cf65617a1c6652ab652e4dbcb44ce3aaee62ce59c212" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:43.722883 kubelet[2901]: E0121 00:58:43.722860 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.723037 kubelet[2901]: W0121 00:58:43.723025 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.723113 kubelet[2901]: E0121 00:58:43.723104 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.723303 kubelet[2901]: E0121 00:58:43.723297 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.723354 kubelet[2901]: W0121 00:58:43.723348 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.723393 kubelet[2901]: E0121 00:58:43.723387 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.723557 kubelet[2901]: E0121 00:58:43.723551 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.723610 kubelet[2901]: W0121 00:58:43.723603 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.723646 kubelet[2901]: E0121 00:58:43.723640 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.723865 kubelet[2901]: E0121 00:58:43.723858 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.724118 kubelet[2901]: W0121 00:58:43.724107 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.724202 kubelet[2901]: E0121 00:58:43.724194 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.725648 kubelet[2901]: E0121 00:58:43.725611 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.725648 kubelet[2901]: W0121 00:58:43.725622 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.725803 kubelet[2901]: E0121 00:58:43.725638 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.725923 kubelet[2901]: E0121 00:58:43.725907 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.725965 kubelet[2901]: W0121 00:58:43.725955 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.726051 kubelet[2901]: E0121 00:58:43.726003 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.726308 kubelet[2901]: E0121 00:58:43.726302 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.726353 kubelet[2901]: W0121 00:58:43.726343 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.726791 kubelet[2901]: E0121 00:58:43.726383 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.729752 kubelet[2901]: E0121 00:58:43.729741 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.729896 kubelet[2901]: W0121 00:58:43.729815 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.729896 kubelet[2901]: E0121 00:58:43.729829 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.730054 kubelet[2901]: E0121 00:58:43.730048 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.730115 kubelet[2901]: W0121 00:58:43.730088 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.730115 kubelet[2901]: E0121 00:58:43.730097 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.730542 kubelet[2901]: E0121 00:58:43.730515 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.730542 kubelet[2901]: W0121 00:58:43.730524 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.730829 kubelet[2901]: E0121 00:58:43.730533 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.731769 kubelet[2901]: E0121 00:58:43.731584 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.731769 kubelet[2901]: W0121 00:58:43.731595 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.731769 kubelet[2901]: E0121 00:58:43.731606 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.732833 kubelet[2901]: E0121 00:58:43.732822 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.733184 kubelet[2901]: W0121 00:58:43.733091 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.733184 kubelet[2901]: E0121 00:58:43.733106 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.733644 kubelet[2901]: E0121 00:58:43.733595 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.733644 kubelet[2901]: W0121 00:58:43.733604 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.733644 kubelet[2901]: E0121 00:58:43.733613 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.734000 systemd[1]: Started cri-containerd-08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091.scope - libcontainer container 08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091. 
Jan 21 00:58:43.735016 kubelet[2901]: E0121 00:58:43.735006 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.735304 kubelet[2901]: W0121 00:58:43.735162 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.735304 kubelet[2901]: E0121 00:58:43.735237 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.736981 kubelet[2901]: E0121 00:58:43.736414 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.742016 kubelet[2901]: W0121 00:58:43.741478 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.742016 kubelet[2901]: E0121 00:58:43.741961 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.742016 kubelet[2901]: W0121 00:58:43.741970 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.743378 kubelet[2901]: E0121 00:58:43.743357 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.743659 kubelet[2901]: E0121 00:58:43.743548 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.745214 kubelet[2901]: E0121 00:58:43.745018 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.745214 kubelet[2901]: W0121 00:58:43.745031 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.745881 kubelet[2901]: E0121 00:58:43.745679 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.745881 kubelet[2901]: W0121 00:58:43.745706 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.746216 kubelet[2901]: E0121 00:58:43.746201 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.746216 kubelet[2901]: W0121 00:58:43.746213 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.746670 kubelet[2901]: E0121 00:58:43.746532 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.746670 kubelet[2901]: W0121 00:58:43.746542 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.746670 kubelet[2901]: E0121 00:58:43.746552 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.746961 kubelet[2901]: E0121 00:58:43.746827 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.747309 kubelet[2901]: E0121 00:58:43.747289 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.747517 kubelet[2901]: W0121 00:58:43.747418 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.747517 kubelet[2901]: E0121 00:58:43.747431 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.748433 kubelet[2901]: E0121 00:58:43.748379 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.748433 kubelet[2901]: W0121 00:58:43.748390 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.748433 kubelet[2901]: E0121 00:58:43.748401 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.750329 kubelet[2901]: E0121 00:58:43.750212 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.750329 kubelet[2901]: W0121 00:58:43.750227 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.750560 kubelet[2901]: E0121 00:58:43.750242 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.750560 kubelet[2901]: E0121 00:58:43.750440 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.751029 kubelet[2901]: E0121 00:58:43.751018 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.751775 kubelet[2901]: E0121 00:58:43.751670 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.751775 kubelet[2901]: W0121 00:58:43.751698 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.751775 kubelet[2901]: E0121 00:58:43.751709 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.752512 kubelet[2901]: E0121 00:58:43.752421 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.752512 kubelet[2901]: W0121 00:58:43.752433 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.752512 kubelet[2901]: E0121 00:58:43.752444 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.753695 kubelet[2901]: E0121 00:58:43.753328 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.753695 kubelet[2901]: W0121 00:58:43.753341 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.753695 kubelet[2901]: E0121 00:58:43.753353 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.761876 kubelet[2901]: E0121 00:58:43.761620 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:43.762080 kubelet[2901]: E0121 00:58:43.761879 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.762276 kubelet[2901]: W0121 00:58:43.762134 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.762276 kubelet[2901]: E0121 00:58:43.762153 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.780000 audit: BPF prog-id=151 op=LOAD Jan 21 00:58:43.781000 audit: BPF prog-id=152 op=LOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=152 op=UNLOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=153 op=LOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=154 op=LOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=154 op=UNLOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=153 op=UNLOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.781000 audit: BPF prog-id=155 op=LOAD Jan 21 00:58:43.781000 audit[3322]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=3311 pid=3322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313334623034613565343034636331353435666436343435653432 Jan 21 00:58:43.802174 kubelet[2901]: E0121 00:58:43.802137 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.802480 kubelet[2901]: W0121 00:58:43.802159 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.802480 kubelet[2901]: E0121 00:58:43.802432 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.802915 kubelet[2901]: E0121 00:58:43.802842 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.802915 kubelet[2901]: W0121 00:58:43.802853 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.802915 kubelet[2901]: E0121 00:58:43.802865 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.803271 kubelet[2901]: E0121 00:58:43.803221 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.803420 kubelet[2901]: W0121 00:58:43.803308 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.803420 kubelet[2901]: E0121 00:58:43.803320 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.803851 kubelet[2901]: E0121 00:58:43.803712 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.803851 kubelet[2901]: W0121 00:58:43.803721 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.803851 kubelet[2901]: E0121 00:58:43.803730 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.804168 kubelet[2901]: E0121 00:58:43.804141 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.804168 kubelet[2901]: W0121 00:58:43.804149 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.804353 kubelet[2901]: E0121 00:58:43.804251 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.804629 kubelet[2901]: E0121 00:58:43.804513 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.804629 kubelet[2901]: W0121 00:58:43.804520 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.804629 kubelet[2901]: E0121 00:58:43.804527 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.805088 kubelet[2901]: E0121 00:58:43.805038 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.805385 kubelet[2901]: W0121 00:58:43.805320 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.805385 kubelet[2901]: E0121 00:58:43.805336 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.805620 kubelet[2901]: E0121 00:58:43.805612 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.805678 kubelet[2901]: W0121 00:58:43.805649 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.805678 kubelet[2901]: E0121 00:58:43.805658 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.806275 kubelet[2901]: E0121 00:58:43.806235 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.806275 kubelet[2901]: W0121 00:58:43.806243 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.806275 kubelet[2901]: E0121 00:58:43.806250 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.806519 kubelet[2901]: E0121 00:58:43.806482 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.806519 kubelet[2901]: W0121 00:58:43.806489 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.806519 kubelet[2901]: E0121 00:58:43.806495 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.806893 kubelet[2901]: E0121 00:58:43.806816 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.806893 kubelet[2901]: W0121 00:58:43.806825 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.806893 kubelet[2901]: E0121 00:58:43.806832 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.807528 kubelet[2901]: E0121 00:58:43.807474 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.807528 kubelet[2901]: W0121 00:58:43.807483 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.807528 kubelet[2901]: E0121 00:58:43.807491 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.807773 kubelet[2901]: E0121 00:58:43.807735 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.807773 kubelet[2901]: W0121 00:58:43.807742 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.807773 kubelet[2901]: E0121 00:58:43.807749 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.807991 kubelet[2901]: E0121 00:58:43.807957 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.807991 kubelet[2901]: W0121 00:58:43.807964 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.807991 kubelet[2901]: E0121 00:58:43.807970 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.808895 kubelet[2901]: E0121 00:58:43.808880 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.808925 kubelet[2901]: W0121 00:58:43.808896 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.808925 kubelet[2901]: E0121 00:58:43.808910 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.809056 kubelet[2901]: E0121 00:58:43.809047 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.809082 kubelet[2901]: W0121 00:58:43.809057 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.809082 kubelet[2901]: E0121 00:58:43.809064 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.809205 kubelet[2901]: E0121 00:58:43.809197 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.809225 kubelet[2901]: W0121 00:58:43.809217 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.809249 kubelet[2901]: E0121 00:58:43.809223 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.809344 kubelet[2901]: E0121 00:58:43.809336 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.809367 kubelet[2901]: W0121 00:58:43.809344 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.809367 kubelet[2901]: E0121 00:58:43.809349 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.809474 kubelet[2901]: E0121 00:58:43.809467 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.809567 kubelet[2901]: W0121 00:58:43.809474 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.809594 kubelet[2901]: E0121 00:58:43.809568 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.809724 kubelet[2901]: E0121 00:58:43.809715 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.809747 kubelet[2901]: W0121 00:58:43.809726 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.809747 kubelet[2901]: E0121 00:58:43.809733 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.817155 kubelet[2901]: E0121 00:58:43.817024 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.817155 kubelet[2901]: W0121 00:58:43.817042 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.817155 kubelet[2901]: E0121 00:58:43.817057 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.817155 kubelet[2901]: I0121 00:58:43.817081 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/851d5829-334a-4f46-97de-87be973a0b77-varrun\") pod \"csi-node-driver-8lkb9\" (UID: \"851d5829-334a-4f46-97de-87be973a0b77\") " pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:43.817607 kubelet[2901]: E0121 00:58:43.817349 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.817705 kubelet[2901]: W0121 00:58:43.817665 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.817786 kubelet[2901]: E0121 00:58:43.817696 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.817786 kubelet[2901]: I0121 00:58:43.817761 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/851d5829-334a-4f46-97de-87be973a0b77-registration-dir\") pod \"csi-node-driver-8lkb9\" (UID: \"851d5829-334a-4f46-97de-87be973a0b77\") " pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:43.817986 kubelet[2901]: E0121 00:58:43.817969 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.817986 kubelet[2901]: W0121 00:58:43.817984 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.818042 kubelet[2901]: E0121 00:58:43.818000 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.818172 kubelet[2901]: E0121 00:58:43.818131 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.818172 kubelet[2901]: W0121 00:58:43.818141 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.818172 kubelet[2901]: E0121 00:58:43.818148 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.818377 kubelet[2901]: E0121 00:58:43.818366 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.818377 kubelet[2901]: W0121 00:58:43.818375 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.818424 kubelet[2901]: E0121 00:58:43.818391 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.818424 kubelet[2901]: I0121 00:58:43.818409 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/851d5829-334a-4f46-97de-87be973a0b77-socket-dir\") pod \"csi-node-driver-8lkb9\" (UID: \"851d5829-334a-4f46-97de-87be973a0b77\") " pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:43.818977 kubelet[2901]: E0121 00:58:43.818932 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.818977 kubelet[2901]: W0121 00:58:43.818946 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.818977 kubelet[2901]: E0121 00:58:43.818965 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.819079 kubelet[2901]: I0121 00:58:43.818980 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/851d5829-334a-4f46-97de-87be973a0b77-kubelet-dir\") pod \"csi-node-driver-8lkb9\" (UID: \"851d5829-334a-4f46-97de-87be973a0b77\") " pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:43.819717 kubelet[2901]: E0121 00:58:43.819164 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.819717 kubelet[2901]: W0121 00:58:43.819171 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.819717 kubelet[2901]: E0121 00:58:43.819468 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.819717 kubelet[2901]: I0121 00:58:43.819488 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvxfg\" (UniqueName: \"kubernetes.io/projected/851d5829-334a-4f46-97de-87be973a0b77-kube-api-access-kvxfg\") pod \"csi-node-driver-8lkb9\" (UID: \"851d5829-334a-4f46-97de-87be973a0b77\") " pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:43.819717 kubelet[2901]: E0121 00:58:43.819542 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.819717 kubelet[2901]: W0121 00:58:43.819548 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.819860 kubelet[2901]: E0121 00:58:43.819744 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.819883 kubelet[2901]: E0121 00:58:43.819872 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.819883 kubelet[2901]: W0121 00:58:43.819879 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.819938 kubelet[2901]: E0121 00:58:43.819891 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.820113 kubelet[2901]: E0121 00:58:43.820102 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.820113 kubelet[2901]: W0121 00:58:43.820111 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.820169 kubelet[2901]: E0121 00:58:43.820121 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.820404 kubelet[2901]: E0121 00:58:43.820392 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.820404 kubelet[2901]: W0121 00:58:43.820402 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.820484 kubelet[2901]: E0121 00:58:43.820414 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.820588 kubelet[2901]: E0121 00:58:43.820575 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.820588 kubelet[2901]: W0121 00:58:43.820583 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.820639 kubelet[2901]: E0121 00:58:43.820609 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.820865 kubelet[2901]: E0121 00:58:43.820853 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.820865 kubelet[2901]: W0121 00:58:43.820863 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.820934 kubelet[2901]: E0121 00:58:43.820870 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.821098 kubelet[2901]: E0121 00:58:43.821086 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.821098 kubelet[2901]: W0121 00:58:43.821096 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.821250 kubelet[2901]: E0121 00:58:43.821103 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.821353 kubelet[2901]: E0121 00:58:43.821311 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.821353 kubelet[2901]: W0121 00:58:43.821320 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.821353 kubelet[2901]: E0121 00:58:43.821327 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.850142 containerd[1680]: time="2026-01-21T00:58:43.850080518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68dd87d7b4-bbf8k,Uid:77cc8a44-a428-4b01-bb84-4901934a7f50,Namespace:calico-system,Attempt:0,} returns sandbox id \"08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091\"" Jan 21 00:58:43.851159 containerd[1680]: time="2026-01-21T00:58:43.851141178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 21 00:58:43.855590 containerd[1680]: time="2026-01-21T00:58:43.855467896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m2vxz,Uid:fff7668a-1e6d-4df8-8d2a-dc72b548d806,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:43.886718 containerd[1680]: time="2026-01-21T00:58:43.886494820Z" level=info msg="connecting to shim ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f" address="unix:///run/containerd/s/5d5ee1b77ec652855177580ca59af291888096523a75d311b38a2ae908263b56" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:58:43.912917 systemd[1]: Started cri-containerd-ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f.scope - libcontainer container ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f. Jan 21 00:58:43.920460 kubelet[2901]: E0121 00:58:43.920413 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.920460 kubelet[2901]: W0121 00:58:43.920432 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.920724 kubelet[2901]: E0121 00:58:43.920549 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.920985 kubelet[2901]: E0121 00:58:43.920976 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.921096 kubelet[2901]: W0121 00:58:43.921018 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.921096 kubelet[2901]: E0121 00:58:43.921037 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.921329 kubelet[2901]: E0121 00:58:43.921272 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.921329 kubelet[2901]: W0121 00:58:43.921279 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.921329 kubelet[2901]: E0121 00:58:43.921292 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.921949 kubelet[2901]: E0121 00:58:43.921930 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.922005 kubelet[2901]: W0121 00:58:43.921949 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.922005 kubelet[2901]: E0121 00:58:43.921969 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.922513 kubelet[2901]: E0121 00:58:43.922498 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.922715 kubelet[2901]: W0121 00:58:43.922700 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.922782 kubelet[2901]: E0121 00:58:43.922767 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.923222 kubelet[2901]: E0121 00:58:43.923203 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.923222 kubelet[2901]: W0121 00:58:43.923213 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.923222 kubelet[2901]: E0121 00:58:43.923246 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.923590 kubelet[2901]: E0121 00:58:43.923580 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.923590 kubelet[2901]: W0121 00:58:43.923590 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.923705 kubelet[2901]: E0121 00:58:43.923643 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.923890 kubelet[2901]: E0121 00:58:43.923880 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.923919 kubelet[2901]: W0121 00:58:43.923890 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.923959 kubelet[2901]: E0121 00:58:43.923946 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.924420 kubelet[2901]: E0121 00:58:43.924409 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.924491 kubelet[2901]: W0121 00:58:43.924421 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.924491 kubelet[2901]: E0121 00:58:43.924465 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.924588 kubelet[2901]: E0121 00:58:43.924579 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.924646 kubelet[2901]: W0121 00:58:43.924587 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.924646 kubelet[2901]: E0121 00:58:43.924617 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.924734 kubelet[2901]: E0121 00:58:43.924725 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.924734 kubelet[2901]: W0121 00:58:43.924733 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.924780 kubelet[2901]: E0121 00:58:43.924746 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.924991 kubelet[2901]: E0121 00:58:43.924983 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.925035 kubelet[2901]: W0121 00:58:43.925023 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.925077 kubelet[2901]: E0121 00:58:43.925071 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.925255 kubelet[2901]: E0121 00:58:43.925240 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.925255 kubelet[2901]: W0121 00:58:43.925246 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.925366 kubelet[2901]: E0121 00:58:43.925312 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.925452 kubelet[2901]: E0121 00:58:43.925447 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.925611 kubelet[2901]: W0121 00:58:43.925504 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.925611 kubelet[2901]: E0121 00:58:43.925517 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.925734 kubelet[2901]: E0121 00:58:43.925722 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.925758 kubelet[2901]: W0121 00:58:43.925734 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.925758 kubelet[2901]: E0121 00:58:43.925748 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.926764 kubelet[2901]: E0121 00:58:43.926753 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.926793 kubelet[2901]: W0121 00:58:43.926764 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.926865 kubelet[2901]: E0121 00:58:43.926834 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.926937 kubelet[2901]: E0121 00:58:43.926915 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.926937 kubelet[2901]: W0121 00:58:43.926924 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.927079 kubelet[2901]: E0121 00:58:43.926956 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.927079 kubelet[2901]: E0121 00:58:43.927051 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.927079 kubelet[2901]: W0121 00:58:43.927057 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.927232 kubelet[2901]: E0121 00:58:43.927159 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.927232 kubelet[2901]: W0121 00:58:43.927164 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.927481 kubelet[2901]: E0121 00:58:43.927438 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.927481 kubelet[2901]: W0121 00:58:43.927448 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.927573 kubelet[2901]: E0121 00:58:43.927558 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.927648 kubelet[2901]: E0121 00:58:43.927563 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.927648 kubelet[2901]: W0121 00:58:43.927622 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.927648 kubelet[2901]: E0121 00:58:43.927630 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.927835 kubelet[2901]: E0121 00:58:43.927569 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.927835 kubelet[2901]: E0121 00:58:43.927579 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.927985 kubelet[2901]: E0121 00:58:43.927979 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.928019 kubelet[2901]: W0121 00:58:43.928014 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.928123 kubelet[2901]: E0121 00:58:43.928097 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.928279 kubelet[2901]: E0121 00:58:43.928268 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.928304 kubelet[2901]: W0121 00:58:43.928279 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.928304 kubelet[2901]: E0121 00:58:43.928294 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.928556 kubelet[2901]: E0121 00:58:43.928548 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.928585 kubelet[2901]: W0121 00:58:43.928557 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.928585 kubelet[2901]: E0121 00:58:43.928565 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:43.929064 kubelet[2901]: E0121 00:58:43.929053 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.929064 kubelet[2901]: W0121 00:58:43.929063 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.929113 kubelet[2901]: E0121 00:58:43.929072 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.929000 audit: BPF prog-id=156 op=LOAD Jan 21 00:58:43.930000 audit: BPF prog-id=157 op=LOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=157 op=UNLOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=158 op=LOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=159 op=LOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=159 op=UNLOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=158 op=UNLOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.930000 audit: BPF prog-id=160 op=LOAD Jan 21 00:58:43.930000 audit[3441]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3429 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:43.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666333130346134366632306664653534346432613039636430663261 Jan 21 00:58:43.938602 kubelet[2901]: E0121 00:58:43.938583 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:43.938602 kubelet[2901]: W0121 00:58:43.938599 2901 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:43.938725 kubelet[2901]: E0121 00:58:43.938626 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:43.951551 containerd[1680]: time="2026-01-21T00:58:43.951510609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m2vxz,Uid:fff7668a-1e6d-4df8-8d2a-dc72b548d806,Namespace:calico-system,Attempt:0,} returns sandbox id \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\"" Jan 21 00:58:44.338000 audit[3494]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3494 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:44.342019 kernel: kauditd_printk_skb: 69 callbacks suppressed Jan 21 00:58:44.342066 kernel: audit: type=1325 audit(1768957124.338:551): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3494 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:44.338000 audit[3494]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe477a8600 a2=0 a3=7ffe477a85ec items=0 ppid=3056 pid=3494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:44.346720 kernel: audit: type=1300 audit(1768957124.338:551): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe477a8600 a2=0 a3=7ffe477a85ec items=0 ppid=3056 pid=3494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:44.338000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:44.348000 audit[3494]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3494 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:44.353906 kernel: audit: type=1327 audit(1768957124.338:551): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:44.353959 kernel: audit: type=1325 audit(1768957124.348:552): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3494 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:58:44.348000 audit[3494]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe477a8600 a2=0 a3=0 items=0 ppid=3056 pid=3494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:44.357289 kernel: audit: type=1300 audit(1768957124.348:552): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe477a8600 a2=0 a3=0 items=0 ppid=3056 pid=3494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:44.348000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:44.361711 kernel: audit: type=1327 audit(1768957124.348:552): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:58:45.142562 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1999641730.mount: Deactivated successfully. 
Jan 21 00:58:45.720160 containerd[1680]: time="2026-01-21T00:58:45.720111085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:45.722293 containerd[1680]: time="2026-01-21T00:58:45.722182019Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 21 00:58:45.724175 containerd[1680]: time="2026-01-21T00:58:45.724093711Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:45.726671 containerd[1680]: time="2026-01-21T00:58:45.726622557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:45.727199 containerd[1680]: time="2026-01-21T00:58:45.727039789Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 1.875698545s" Jan 21 00:58:45.727199 containerd[1680]: time="2026-01-21T00:58:45.727066172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 21 00:58:45.728886 containerd[1680]: time="2026-01-21T00:58:45.728842155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 21 00:58:45.742730 containerd[1680]: time="2026-01-21T00:58:45.742088561Z" level=info msg="CreateContainer within sandbox \"08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 21 00:58:45.753958 containerd[1680]: time="2026-01-21T00:58:45.753925396Z" level=info msg="Container 4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:45.756266 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3596155589.mount: Deactivated successfully. Jan 21 00:58:45.765694 containerd[1680]: time="2026-01-21T00:58:45.765658275Z" level=info msg="CreateContainer within sandbox \"08134b04a5e404cc1545fd6445e4237aa643f82fdcb39b5cad17f1c72406f091\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771\"" Jan 21 00:58:45.766137 containerd[1680]: time="2026-01-21T00:58:45.766109451Z" level=info msg="StartContainer for \"4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771\"" Jan 21 00:58:45.767189 containerd[1680]: time="2026-01-21T00:58:45.767167146Z" level=info msg="connecting to shim 4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771" address="unix:///run/containerd/s/26a948851580aa90a986cf65617a1c6652ab652e4dbcb44ce3aaee62ce59c212" protocol=ttrpc version=3 Jan 21 00:58:45.789896 systemd[1]: Started cri-containerd-4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771.scope - libcontainer container 4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771. 
Jan 21 00:58:45.802000 audit: BPF prog-id=161 op=LOAD Jan 21 00:58:45.804000 audit: BPF prog-id=162 op=LOAD Jan 21 00:58:45.807195 kernel: audit: type=1334 audit(1768957125.802:553): prog-id=161 op=LOAD Jan 21 00:58:45.807227 kernel: audit: type=1334 audit(1768957125.804:554): prog-id=162 op=LOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.809954 kernel: audit: type=1300 audit(1768957125.804:554): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=162 op=UNLOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.817918 kernel: audit: 
type=1327 audit(1768957125.804:554): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=163 op=LOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=164 op=LOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=164 op=UNLOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=163 op=UNLOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.804000 audit: BPF prog-id=165 op=LOAD Jan 21 00:58:45.804000 audit[3505]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3311 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:45.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438353163356539303966663432623065636238663638343831393661 Jan 21 00:58:45.852862 containerd[1680]: time="2026-01-21T00:58:45.852822178Z" level=info msg="StartContainer for \"4851c5e909ff42b0ecb8f6848196a30834fe34f19690e454ca36bce2169d4771\" returns successfully" Jan 21 00:58:46.211124 kubelet[2901]: E0121 00:58:46.211045 2901 
pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:46.277077 kubelet[2901]: I0121 00:58:46.276915 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-68dd87d7b4-bbf8k" podStartSLOduration=1.400043867 podStartE2EDuration="3.276740411s" podCreationTimestamp="2026-01-21 00:58:43 +0000 UTC" firstStartedPulling="2026-01-21 00:58:43.850972667 +0000 UTC m=+18.791588263" lastFinishedPulling="2026-01-21 00:58:45.727669209 +0000 UTC m=+20.668284807" observedRunningTime="2026-01-21 00:58:46.276506464 +0000 UTC m=+21.217122084" watchObservedRunningTime="2026-01-21 00:58:46.276740411 +0000 UTC m=+21.217356061" Jan 21 00:58:46.326056 kubelet[2901]: E0121 00:58:46.326029 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.326056 kubelet[2901]: W0121 00:58:46.326049 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.326234 kubelet[2901]: E0121 00:58:46.326069 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.326234 kubelet[2901]: E0121 00:58:46.326220 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.326234 kubelet[2901]: W0121 00:58:46.326227 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.326305 kubelet[2901]: E0121 00:58:46.326235 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.326389 kubelet[2901]: E0121 00:58:46.326374 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.326389 kubelet[2901]: W0121 00:58:46.326383 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.326432 kubelet[2901]: E0121 00:58:46.326390 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.326700 kubelet[2901]: E0121 00:58:46.326629 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.326700 kubelet[2901]: W0121 00:58:46.326663 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.326700 kubelet[2901]: E0121 00:58:46.326671 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.326859 kubelet[2901]: E0121 00:58:46.326835 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.326889 kubelet[2901]: W0121 00:58:46.326884 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.326914 kubelet[2901]: E0121 00:58:46.326890 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.327121 kubelet[2901]: E0121 00:58:46.327049 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.327121 kubelet[2901]: W0121 00:58:46.327058 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.327121 kubelet[2901]: E0121 00:58:46.327064 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.327587 kubelet[2901]: E0121 00:58:46.327463 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.327587 kubelet[2901]: W0121 00:58:46.327473 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.327587 kubelet[2901]: E0121 00:58:46.327482 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.327676 kubelet[2901]: E0121 00:58:46.327603 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.327676 kubelet[2901]: W0121 00:58:46.327608 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.327676 kubelet[2901]: E0121 00:58:46.327614 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.327984 kubelet[2901]: E0121 00:58:46.327754 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.327984 kubelet[2901]: W0121 00:58:46.327759 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.327984 kubelet[2901]: E0121 00:58:46.327774 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.327984 kubelet[2901]: E0121 00:58:46.327918 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.327984 kubelet[2901]: W0121 00:58:46.327925 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.327984 kubelet[2901]: E0121 00:58:46.327930 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.328212 kubelet[2901]: E0121 00:58:46.328129 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.328212 kubelet[2901]: W0121 00:58:46.328135 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.328212 kubelet[2901]: E0121 00:58:46.328142 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.328456 kubelet[2901]: E0121 00:58:46.328446 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.328480 kubelet[2901]: W0121 00:58:46.328457 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.328480 kubelet[2901]: E0121 00:58:46.328474 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.328657 kubelet[2901]: E0121 00:58:46.328649 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.328657 kubelet[2901]: W0121 00:58:46.328657 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.328710 kubelet[2901]: E0121 00:58:46.328664 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.329305 kubelet[2901]: E0121 00:58:46.329198 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.329305 kubelet[2901]: W0121 00:58:46.329209 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.329305 kubelet[2901]: E0121 00:58:46.329220 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.329458 kubelet[2901]: E0121 00:58:46.329452 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.329494 kubelet[2901]: W0121 00:58:46.329489 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.329527 kubelet[2901]: E0121 00:58:46.329521 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.344214 kubelet[2901]: E0121 00:58:46.344155 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.344580 kubelet[2901]: W0121 00:58:46.344183 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.344580 kubelet[2901]: E0121 00:58:46.344299 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.344863 kubelet[2901]: E0121 00:58:46.344853 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.345003 kubelet[2901]: W0121 00:58:46.344899 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.345003 kubelet[2901]: E0121 00:58:46.344921 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.345128 kubelet[2901]: E0121 00:58:46.345122 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.345164 kubelet[2901]: W0121 00:58:46.345158 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.345212 kubelet[2901]: E0121 00:58:46.345202 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.345596 kubelet[2901]: E0121 00:58:46.345562 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.345596 kubelet[2901]: W0121 00:58:46.345586 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.345769 kubelet[2901]: E0121 00:58:46.345627 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.346270 kubelet[2901]: E0121 00:58:46.346254 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.346270 kubelet[2901]: W0121 00:58:46.346266 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.346338 kubelet[2901]: E0121 00:58:46.346282 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.346705 kubelet[2901]: E0121 00:58:46.346539 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.346705 kubelet[2901]: W0121 00:58:46.346551 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.346705 kubelet[2901]: E0121 00:58:46.346562 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.347331 kubelet[2901]: E0121 00:58:46.347224 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.347331 kubelet[2901]: W0121 00:58:46.347315 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.347539 kubelet[2901]: E0121 00:58:46.347489 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.348222 kubelet[2901]: E0121 00:58:46.348129 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.348222 kubelet[2901]: W0121 00:58:46.348141 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.348222 kubelet[2901]: E0121 00:58:46.348155 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.348418 kubelet[2901]: E0121 00:58:46.348347 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.348418 kubelet[2901]: W0121 00:58:46.348361 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.348418 kubelet[2901]: E0121 00:58:46.348372 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.348586 kubelet[2901]: E0121 00:58:46.348488 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.348586 kubelet[2901]: W0121 00:58:46.348497 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.348586 kubelet[2901]: E0121 00:58:46.348504 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.348860 kubelet[2901]: E0121 00:58:46.348612 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.348860 kubelet[2901]: W0121 00:58:46.348618 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.348860 kubelet[2901]: E0121 00:58:46.348624 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.348860 kubelet[2901]: E0121 00:58:46.348771 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.348860 kubelet[2901]: W0121 00:58:46.348776 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.348860 kubelet[2901]: E0121 00:58:46.348783 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.349283 kubelet[2901]: E0121 00:58:46.349269 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.349283 kubelet[2901]: W0121 00:58:46.349281 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.349458 kubelet[2901]: E0121 00:58:46.349331 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.349458 kubelet[2901]: E0121 00:58:46.349404 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.349458 kubelet[2901]: W0121 00:58:46.349410 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.349458 kubelet[2901]: E0121 00:58:46.349424 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.349635 kubelet[2901]: E0121 00:58:46.349519 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.349635 kubelet[2901]: W0121 00:58:46.349525 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.349635 kubelet[2901]: E0121 00:58:46.349531 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.349915 kubelet[2901]: E0121 00:58:46.349648 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.349915 kubelet[2901]: W0121 00:58:46.349653 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.349915 kubelet[2901]: E0121 00:58:46.349660 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.349915 kubelet[2901]: E0121 00:58:46.349908 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.349915 kubelet[2901]: W0121 00:58:46.349914 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.350280 kubelet[2901]: E0121 00:58:46.350013 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 21 00:58:46.350280 kubelet[2901]: E0121 00:58:46.350017 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 21 00:58:46.350280 kubelet[2901]: W0121 00:58:46.350050 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 21 00:58:46.350280 kubelet[2901]: E0121 00:58:46.350060 2901 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 21 00:58:46.981966 containerd[1680]: time="2026-01-21T00:58:46.981898412Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:46.983859 containerd[1680]: time="2026-01-21T00:58:46.983820703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 21 00:58:46.985784 containerd[1680]: time="2026-01-21T00:58:46.985740169Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:46.989227 containerd[1680]: time="2026-01-21T00:58:46.988751486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:46.989227 containerd[1680]: time="2026-01-21T00:58:46.989101188Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.260199831s" Jan 21 00:58:46.989227 containerd[1680]: time="2026-01-21T00:58:46.989149413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 21 00:58:46.990955 containerd[1680]: time="2026-01-21T00:58:46.990856004Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 21 00:58:47.004467 containerd[1680]: time="2026-01-21T00:58:47.004443038Z" level=info msg="Container 35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:47.015324 containerd[1680]: time="2026-01-21T00:58:47.015277535Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a\"" Jan 21 00:58:47.016791 containerd[1680]: time="2026-01-21T00:58:47.016770408Z" level=info msg="StartContainer for \"35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a\"" Jan 21 00:58:47.018204 containerd[1680]: time="2026-01-21T00:58:47.018166766Z" level=info msg="connecting to shim 35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a" address="unix:///run/containerd/s/5d5ee1b77ec652855177580ca59af291888096523a75d311b38a2ae908263b56" protocol=ttrpc version=3 Jan 21 00:58:47.042949 systemd[1]: Started cri-containerd-35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a.scope - libcontainer container 35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a. 
Jan 21 00:58:47.099000 audit: BPF prog-id=166 op=LOAD Jan 21 00:58:47.099000 audit[3580]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000220488 a2=98 a3=0 items=0 ppid=3429 pid=3580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:47.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335333235656539343361393061633037663532356538336538386534 Jan 21 00:58:47.099000 audit: BPF prog-id=167 op=LOAD Jan 21 00:58:47.099000 audit[3580]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000220218 a2=98 a3=0 items=0 ppid=3429 pid=3580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:47.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335333235656539343361393061633037663532356538336538386534 Jan 21 00:58:47.099000 audit: BPF prog-id=167 op=UNLOAD Jan 21 00:58:47.099000 audit[3580]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:47.099000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335333235656539343361393061633037663532356538336538386534 Jan 21 00:58:47.099000 audit: BPF prog-id=166 op=UNLOAD Jan 21 00:58:47.099000 audit[3580]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:47.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335333235656539343361393061633037663532356538336538386534 Jan 21 00:58:47.099000 audit: BPF prog-id=168 op=LOAD Jan 21 00:58:47.099000 audit[3580]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0002206e8 a2=98 a3=0 items=0 ppid=3429 pid=3580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:47.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335333235656539343361393061633037663532356538336538386534 Jan 21 00:58:47.118809 containerd[1680]: time="2026-01-21T00:58:47.118720857Z" level=info msg="StartContainer for \"35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a\" returns successfully" Jan 21 00:58:47.126808 systemd[1]: cri-containerd-35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a.scope: Deactivated successfully. 
Jan 21 00:58:47.128000 audit: BPF prog-id=168 op=UNLOAD Jan 21 00:58:47.130699 containerd[1680]: time="2026-01-21T00:58:47.130635062Z" level=info msg="received container exit event container_id:\"35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a\" id:\"35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a\" pid:3593 exited_at:{seconds:1768957127 nanos:130328388}" Jan 21 00:58:47.152745 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-35325ee943a90ac07f525e83e88e427738b5958e060edbfb269bbf705b16959a-rootfs.mount: Deactivated successfully. Jan 21 00:58:47.269828 kubelet[2901]: I0121 00:58:47.268750 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 00:58:48.210406 kubelet[2901]: E0121 00:58:48.210347 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:49.275702 containerd[1680]: time="2026-01-21T00:58:49.275526609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 21 00:58:50.210872 kubelet[2901]: E0121 00:58:50.210808 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:51.851196 containerd[1680]: time="2026-01-21T00:58:51.850643764Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:51.852134 containerd[1680]: time="2026-01-21T00:58:51.852094407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: 
active requests=0, bytes read=70442291" Jan 21 00:58:51.853713 containerd[1680]: time="2026-01-21T00:58:51.853676120Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:51.856241 containerd[1680]: time="2026-01-21T00:58:51.856186153Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:58:51.856896 containerd[1680]: time="2026-01-21T00:58:51.856751548Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.581187681s" Jan 21 00:58:51.856896 containerd[1680]: time="2026-01-21T00:58:51.856784830Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 21 00:58:51.860653 containerd[1680]: time="2026-01-21T00:58:51.860607719Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 21 00:58:51.875735 containerd[1680]: time="2026-01-21T00:58:51.872404255Z" level=info msg="Container 1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:58:51.885975 containerd[1680]: time="2026-01-21T00:58:51.885927427Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880\"" Jan 21 00:58:51.886773 containerd[1680]: time="2026-01-21T00:58:51.886615031Z" level=info msg="StartContainer for \"1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880\"" Jan 21 00:58:51.889600 containerd[1680]: time="2026-01-21T00:58:51.889568868Z" level=info msg="connecting to shim 1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880" address="unix:///run/containerd/s/5d5ee1b77ec652855177580ca59af291888096523a75d311b38a2ae908263b56" protocol=ttrpc version=3 Jan 21 00:58:51.916020 systemd[1]: Started cri-containerd-1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880.scope - libcontainer container 1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880. Jan 21 00:58:51.977000 audit: BPF prog-id=169 op=LOAD Jan 21 00:58:51.979826 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 21 00:58:51.979915 kernel: audit: type=1334 audit(1768957131.977:567): prog-id=169 op=LOAD Jan 21 00:58:51.977000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.983701 kernel: audit: type=1300 audit(1768957131.977:567): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 
00:58:51.988430 kernel: audit: type=1327 audit(1768957131.977:567): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:51.977000 audit: BPF prog-id=170 op=LOAD Jan 21 00:58:51.977000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.996201 kernel: audit: type=1334 audit(1768957131.977:568): prog-id=170 op=LOAD Jan 21 00:58:51.996289 kernel: audit: type=1300 audit(1768957131.977:568): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:52.000902 kernel: audit: type=1327 audit(1768957131.977:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:51.977000 audit: BPF prog-id=170 op=UNLOAD Jan 21 00:58:52.003970 kernel: audit: type=1334 audit(1768957131.977:569): prog-id=170 op=UNLOAD Jan 21 00:58:51.977000 audit[3640]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:52.013727 kernel: audit: type=1300 audit(1768957131.977:569): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:52.019703 kernel: audit: type=1327 audit(1768957131.977:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:52.023733 kernel: audit: type=1334 audit(1768957131.977:570): prog-id=169 op=UNLOAD Jan 21 00:58:51.977000 audit: BPF prog-id=169 op=UNLOAD Jan 21 00:58:51.977000 audit[3640]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.977000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:51.977000 audit: BPF prog-id=171 op=LOAD Jan 21 00:58:51.977000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3429 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:58:51.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161323165386134306533353732356561646464333832353965626362 Jan 21 00:58:52.031482 containerd[1680]: time="2026-01-21T00:58:52.031440482Z" level=info msg="StartContainer for \"1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880\" returns successfully" Jan 21 00:58:52.211015 kubelet[2901]: E0121 00:58:52.210968 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:53.341642 containerd[1680]: time="2026-01-21T00:58:53.341585557Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 21 00:58:53.344569 systemd[1]: cri-containerd-1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880.scope: 
Deactivated successfully. Jan 21 00:58:53.345310 systemd[1]: cri-containerd-1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880.scope: Consumed 464ms CPU time, 196.1M memory peak, 171.3M written to disk. Jan 21 00:58:53.346000 audit: BPF prog-id=171 op=UNLOAD Jan 21 00:58:53.348081 containerd[1680]: time="2026-01-21T00:58:53.348038192Z" level=info msg="received container exit event container_id:\"1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880\" id:\"1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880\" pid:3653 exited_at:{seconds:1768957133 nanos:346571274}" Jan 21 00:58:53.368295 kubelet[2901]: I0121 00:58:53.368263 2901 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 21 00:58:53.378306 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1a21e8a40e35725eaddd38259ebcbf5e923cc76a30e27f1ab4a305d199217880-rootfs.mount: Deactivated successfully. Jan 21 00:58:53.425444 systemd[1]: Created slice kubepods-burstable-pod505024bb_cdb3_4e80_9c79_74499326a9e9.slice - libcontainer container kubepods-burstable-pod505024bb_cdb3_4e80_9c79_74499326a9e9.slice. Jan 21 00:58:53.433337 systemd[1]: Created slice kubepods-besteffort-pod22543a80_9d55_4110_b0da_aa35bd7688e7.slice - libcontainer container kubepods-besteffort-pod22543a80_9d55_4110_b0da_aa35bd7688e7.slice. 
Jan 21 00:58:54.047170 kubelet[2901]: I0121 00:58:53.498459 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-backend-key-pair\") pod \"whisker-787f4b58dc-gdnrh\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " pod="calico-system/whisker-787f4b58dc-gdnrh" Jan 21 00:58:54.047170 kubelet[2901]: I0121 00:58:53.498497 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z62\" (UniqueName: \"kubernetes.io/projected/59ce7fac-1a6e-4ec4-b99e-063ed3e3444c-kube-api-access-q6z62\") pod \"calico-kube-controllers-bc7877fd9-wf2hg\" (UID: \"59ce7fac-1a6e-4ec4-b99e-063ed3e3444c\") " pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" Jan 21 00:58:54.047170 kubelet[2901]: I0121 00:58:53.498518 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/22543a80-9d55-4110-b0da-aa35bd7688e7-calico-apiserver-certs\") pod \"calico-apiserver-54cbb8896b-hxx9z\" (UID: \"22543a80-9d55-4110-b0da-aa35bd7688e7\") " pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" Jan 21 00:58:54.047170 kubelet[2901]: I0121 00:58:53.498537 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/557eba89-d604-4304-afae-f0e623ef8722-calico-apiserver-certs\") pod \"calico-apiserver-54cbb8896b-89fcv\" (UID: \"557eba89-d604-4304-afae-f0e623ef8722\") " pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" Jan 21 00:58:54.047170 kubelet[2901]: I0121 00:58:53.498554 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/505024bb-cdb3-4e80-9c79-74499326a9e9-config-volume\") pod \"coredns-668d6bf9bc-gqzzb\" (UID: \"505024bb-cdb3-4e80-9c79-74499326a9e9\") " pod="kube-system/coredns-668d6bf9bc-gqzzb" Jan 21 00:58:53.440326 systemd[1]: Created slice kubepods-besteffort-pod59ce7fac_1a6e_4ec4_b99e_063ed3e3444c.slice - libcontainer container kubepods-besteffort-pod59ce7fac_1a6e_4ec4_b99e_063ed3e3444c.slice. Jan 21 00:58:54.047389 kubelet[2901]: I0121 00:58:53.498571 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvdc\" (UniqueName: \"kubernetes.io/projected/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-kube-api-access-brvdc\") pod \"whisker-787f4b58dc-gdnrh\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " pod="calico-system/whisker-787f4b58dc-gdnrh" Jan 21 00:58:54.047389 kubelet[2901]: I0121 00:58:53.498587 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce7fac-1a6e-4ec4-b99e-063ed3e3444c-tigera-ca-bundle\") pod \"calico-kube-controllers-bc7877fd9-wf2hg\" (UID: \"59ce7fac-1a6e-4ec4-b99e-063ed3e3444c\") " pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" Jan 21 00:58:54.047389 kubelet[2901]: I0121 00:58:53.498609 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbdq\" (UniqueName: \"kubernetes.io/projected/22543a80-9d55-4110-b0da-aa35bd7688e7-kube-api-access-9dbdq\") pod \"calico-apiserver-54cbb8896b-hxx9z\" (UID: \"22543a80-9d55-4110-b0da-aa35bd7688e7\") " pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" Jan 21 00:58:54.047389 kubelet[2901]: I0121 00:58:53.498624 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be64252f-80c4-46f6-a3d4-52a6471b1a63-config\") pod \"goldmane-666569f655-95c8v\" (UID: 
\"be64252f-80c4-46f6-a3d4-52a6471b1a63\") " pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:54.047389 kubelet[2901]: I0121 00:58:53.498643 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9663badd-5b67-43ba-94ba-c1609d9bd7c0-config-volume\") pod \"coredns-668d6bf9bc-78b44\" (UID: \"9663badd-5b67-43ba-94ba-c1609d9bd7c0\") " pod="kube-system/coredns-668d6bf9bc-78b44" Jan 21 00:58:53.450582 systemd[1]: Created slice kubepods-besteffort-podc4ce6349_e9dc_4bcd_9543_fc9564c69c1a.slice - libcontainer container kubepods-besteffort-podc4ce6349_e9dc_4bcd_9543_fc9564c69c1a.slice. Jan 21 00:58:54.047535 kubelet[2901]: I0121 00:58:53.498658 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/be64252f-80c4-46f6-a3d4-52a6471b1a63-goldmane-key-pair\") pod \"goldmane-666569f655-95c8v\" (UID: \"be64252f-80c4-46f6-a3d4-52a6471b1a63\") " pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:54.047535 kubelet[2901]: I0121 00:58:53.498673 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grgbg\" (UniqueName: \"kubernetes.io/projected/be64252f-80c4-46f6-a3d4-52a6471b1a63-kube-api-access-grgbg\") pod \"goldmane-666569f655-95c8v\" (UID: \"be64252f-80c4-46f6-a3d4-52a6471b1a63\") " pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:54.047535 kubelet[2901]: I0121 00:58:53.499255 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xs8\" (UniqueName: \"kubernetes.io/projected/9663badd-5b67-43ba-94ba-c1609d9bd7c0-kube-api-access-n7xs8\") pod \"coredns-668d6bf9bc-78b44\" (UID: \"9663badd-5b67-43ba-94ba-c1609d9bd7c0\") " pod="kube-system/coredns-668d6bf9bc-78b44" Jan 21 00:58:54.047535 kubelet[2901]: I0121 00:58:53.499277 2901 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be64252f-80c4-46f6-a3d4-52a6471b1a63-goldmane-ca-bundle\") pod \"goldmane-666569f655-95c8v\" (UID: \"be64252f-80c4-46f6-a3d4-52a6471b1a63\") " pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:54.047535 kubelet[2901]: I0121 00:58:53.499297 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j29j\" (UniqueName: \"kubernetes.io/projected/505024bb-cdb3-4e80-9c79-74499326a9e9-kube-api-access-9j29j\") pod \"coredns-668d6bf9bc-gqzzb\" (UID: \"505024bb-cdb3-4e80-9c79-74499326a9e9\") " pod="kube-system/coredns-668d6bf9bc-gqzzb" Jan 21 00:58:53.458882 systemd[1]: Created slice kubepods-besteffort-pod557eba89_d604_4304_afae_f0e623ef8722.slice - libcontainer container kubepods-besteffort-pod557eba89_d604_4304_afae_f0e623ef8722.slice. Jan 21 00:58:54.047700 kubelet[2901]: I0121 00:58:53.499425 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-ca-bundle\") pod \"whisker-787f4b58dc-gdnrh\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " pod="calico-system/whisker-787f4b58dc-gdnrh" Jan 21 00:58:54.047700 kubelet[2901]: I0121 00:58:53.499443 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxfm\" (UniqueName: \"kubernetes.io/projected/557eba89-d604-4304-afae-f0e623ef8722-kube-api-access-9hxfm\") pod \"calico-apiserver-54cbb8896b-89fcv\" (UID: \"557eba89-d604-4304-afae-f0e623ef8722\") " pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" Jan 21 00:58:53.473564 systemd[1]: Created slice kubepods-burstable-pod9663badd_5b67_43ba_94ba_c1609d9bd7c0.slice - libcontainer container 
kubepods-burstable-pod9663badd_5b67_43ba_94ba_c1609d9bd7c0.slice. Jan 21 00:58:53.488503 systemd[1]: Created slice kubepods-besteffort-podbe64252f_80c4_46f6_a3d4_52a6471b1a63.slice - libcontainer container kubepods-besteffort-podbe64252f_80c4_46f6_a3d4_52a6471b1a63.slice. Jan 21 00:58:54.082526 containerd[1680]: time="2026-01-21T00:58:54.077223551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-78b44,Uid:9663badd-5b67-43ba-94ba-c1609d9bd7c0,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:54.216983 systemd[1]: Created slice kubepods-besteffort-pod851d5829_334a_4f46_97de_87be973a0b77.slice - libcontainer container kubepods-besteffort-pod851d5829_334a_4f46_97de_87be973a0b77.slice. Jan 21 00:58:54.219349 containerd[1680]: time="2026-01-21T00:58:54.219314296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8lkb9,Uid:851d5829-334a-4f46-97de-87be973a0b77,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:54.350852 containerd[1680]: time="2026-01-21T00:58:54.350629561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-hxx9z,Uid:22543a80-9d55-4110-b0da-aa35bd7688e7,Namespace:calico-apiserver,Attempt:0,}" Jan 21 00:58:54.350852 containerd[1680]: time="2026-01-21T00:58:54.350658645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc7877fd9-wf2hg,Uid:59ce7fac-1a6e-4ec4-b99e-063ed3e3444c,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:54.350852 containerd[1680]: time="2026-01-21T00:58:54.350714057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gqzzb,Uid:505024bb-cdb3-4e80-9c79-74499326a9e9,Namespace:kube-system,Attempt:0,}" Jan 21 00:58:54.361746 containerd[1680]: time="2026-01-21T00:58:54.361651513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-787f4b58dc-gdnrh,Uid:c4ce6349-e9dc-4bcd-9543-fc9564c69c1a,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:54.365462 containerd[1680]: 
time="2026-01-21T00:58:54.365418701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-89fcv,Uid:557eba89-d604-4304-afae-f0e623ef8722,Namespace:calico-apiserver,Attempt:0,}" Jan 21 00:58:54.365462 containerd[1680]: time="2026-01-21T00:58:54.365436535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-95c8v,Uid:be64252f-80c4-46f6-a3d4-52a6471b1a63,Namespace:calico-system,Attempt:0,}" Jan 21 00:58:55.012593 containerd[1680]: time="2026-01-21T00:58:55.012503659Z" level=error msg="Failed to destroy network for sandbox \"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.016051 containerd[1680]: time="2026-01-21T00:58:55.015877943Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-78b44,Uid:9663badd-5b67-43ba-94ba-c1609d9bd7c0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.016187 kubelet[2901]: E0121 00:58:55.016082 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.016187 kubelet[2901]: E0121 00:58:55.016147 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-78b44" Jan 21 00:58:55.016187 kubelet[2901]: E0121 00:58:55.016168 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-78b44" Jan 21 00:58:55.016466 kubelet[2901]: E0121 00:58:55.016208 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-78b44_kube-system(9663badd-5b67-43ba-94ba-c1609d9bd7c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-78b44_kube-system(9663badd-5b67-43ba-94ba-c1609d9bd7c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"68e622a2a4c9fb0a7606bfb724f2068278d6b5df1b3de5111f409b3cce8b577b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-78b44" podUID="9663badd-5b67-43ba-94ba-c1609d9bd7c0" Jan 21 00:58:55.066009 containerd[1680]: time="2026-01-21T00:58:55.065967321Z" level=error msg="Failed to destroy network for sandbox \"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jan 21 00:58:55.071027 containerd[1680]: time="2026-01-21T00:58:55.070965204Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-hxx9z,Uid:22543a80-9d55-4110-b0da-aa35bd7688e7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.072020 kubelet[2901]: E0121 00:58:55.071183 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.072020 kubelet[2901]: E0121 00:58:55.071232 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" Jan 21 00:58:55.072020 kubelet[2901]: E0121 00:58:55.071256 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" Jan 21 00:58:55.072152 kubelet[2901]: E0121 00:58:55.071292 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a7d225568ac01c2940f5e71992c44efc2a43cf0119f2a685fbbdd53c91084725\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:58:55.076671 containerd[1680]: time="2026-01-21T00:58:55.076631624Z" level=error msg="Failed to destroy network for sandbox \"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.081599 containerd[1680]: time="2026-01-21T00:58:55.081537963Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-89fcv,Uid:557eba89-d604-4304-afae-f0e623ef8722,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.083626 kubelet[2901]: E0121 00:58:55.081763 2901 log.go:32] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.083626 kubelet[2901]: E0121 00:58:55.081817 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" Jan 21 00:58:55.083626 kubelet[2901]: E0121 00:58:55.081836 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" Jan 21 00:58:55.083842 kubelet[2901]: E0121 00:58:55.081870 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f527ec19acf4feb8ba3bb5b65ac5c935d02f2f04dddac53abb727fc2e42f6001\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:58:55.087473 containerd[1680]: time="2026-01-21T00:58:55.087437389Z" level=error msg="Failed to destroy network for sandbox \"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.095810 containerd[1680]: time="2026-01-21T00:58:55.095765881Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gqzzb,Uid:505024bb-cdb3-4e80-9c79-74499326a9e9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.096206 kubelet[2901]: E0121 00:58:55.096159 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.096269 kubelet[2901]: E0121 00:58:55.096226 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gqzzb" Jan 21 00:58:55.096269 kubelet[2901]: E0121 00:58:55.096248 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gqzzb" Jan 21 00:58:55.096323 kubelet[2901]: E0121 00:58:55.096287 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gqzzb_kube-system(505024bb-cdb3-4e80-9c79-74499326a9e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gqzzb_kube-system(505024bb-cdb3-4e80-9c79-74499326a9e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"42f887d6d41bda5527b31c9b81d1f52287f551d371c3ce7229d904e9797cd3fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gqzzb" podUID="505024bb-cdb3-4e80-9c79-74499326a9e9" Jan 21 00:58:55.106488 containerd[1680]: time="2026-01-21T00:58:55.106445249Z" level=error msg="Failed to destroy network for sandbox \"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.106832 containerd[1680]: time="2026-01-21T00:58:55.106643215Z" level=error msg="Failed to destroy network for sandbox \"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.107214 containerd[1680]: time="2026-01-21T00:58:55.107185859Z" level=error msg="Failed to destroy network for sandbox \"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.110419 containerd[1680]: time="2026-01-21T00:58:55.110386255Z" level=error msg="Failed to destroy network for sandbox \"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.110621 containerd[1680]: time="2026-01-21T00:58:55.110514869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-95c8v,Uid:be64252f-80c4-46f6-a3d4-52a6471b1a63,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.110854 kubelet[2901]: E0121 00:58:55.110785 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 
00:58:55.110943 kubelet[2901]: E0121 00:58:55.110857 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:55.110943 kubelet[2901]: E0121 00:58:55.110891 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-95c8v" Jan 21 00:58:55.111301 kubelet[2901]: E0121 00:58:55.111274 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9feda7c049af49fe830131e9b703c750e060d2b4943d945aaffdc8b4482e4063\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:58:55.116516 containerd[1680]: time="2026-01-21T00:58:55.116410292Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-787f4b58dc-gdnrh,Uid:c4ce6349-e9dc-4bcd-9543-fc9564c69c1a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.116661 kubelet[2901]: E0121 00:58:55.116580 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.116661 kubelet[2901]: E0121 00:58:55.116620 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-787f4b58dc-gdnrh" Jan 21 00:58:55.116661 kubelet[2901]: E0121 00:58:55.116640 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-787f4b58dc-gdnrh" Jan 21 00:58:55.116786 kubelet[2901]: E0121 00:58:55.116672 2901 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"whisker-787f4b58dc-gdnrh_calico-system(c4ce6349-e9dc-4bcd-9543-fc9564c69c1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-787f4b58dc-gdnrh_calico-system(c4ce6349-e9dc-4bcd-9543-fc9564c69c1a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"542495fef49f25fa7997d3b2a19ae7c9930542371551123a1dfc7868ec084a30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-787f4b58dc-gdnrh" podUID="c4ce6349-e9dc-4bcd-9543-fc9564c69c1a" Jan 21 00:58:55.117884 containerd[1680]: time="2026-01-21T00:58:55.117814456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc7877fd9-wf2hg,Uid:59ce7fac-1a6e-4ec4-b99e-063ed3e3444c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.117974 kubelet[2901]: E0121 00:58:55.117933 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.117974 kubelet[2901]: E0121 00:58:55.117964 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" Jan 21 00:58:55.118036 kubelet[2901]: E0121 00:58:55.117981 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" Jan 21 00:58:55.118126 kubelet[2901]: E0121 00:58:55.118095 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7bb12e21a2547d4d608a495b405aa9445967edb6412368a95f06a57255c547b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:58:55.119265 containerd[1680]: time="2026-01-21T00:58:55.119221219Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8lkb9,Uid:851d5829-334a-4f46-97de-87be973a0b77,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.119441 kubelet[2901]: E0121 00:58:55.119413 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 21 00:58:55.119477 kubelet[2901]: E0121 00:58:55.119449 2901 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:55.119500 kubelet[2901]: E0121 00:58:55.119478 2901 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8lkb9" Jan 21 00:58:55.119521 kubelet[2901]: E0121 00:58:55.119507 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec9d8a646055a0abc5f1c6c4a8e67a4b717bda73f0f0632bf9df372abcfa5579\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:58:55.292198 containerd[1680]: time="2026-01-21T00:58:55.292100317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 21 00:58:55.856517 systemd[1]: run-netns-cni\x2d77d9a59d\x2da194\x2de5c2\x2d3493\x2d0fde4e300f76.mount: Deactivated successfully. Jan 21 00:58:55.856599 systemd[1]: run-netns-cni\x2d1fb4f933\x2d32a3\x2dc455\x2db83c\x2ded8aaef79a57.mount: Deactivated successfully. Jan 21 00:58:55.856656 systemd[1]: run-netns-cni\x2d01502675\x2d53a1\x2ddf26\x2d645f\x2da32b1d657d12.mount: Deactivated successfully. Jan 21 00:58:55.856711 systemd[1]: run-netns-cni\x2d818e8e42\x2d748f\x2da2b7\x2d3892\x2debcbe70bfd4b.mount: Deactivated successfully. Jan 21 00:58:55.856753 systemd[1]: run-netns-cni\x2d76e0f2af\x2da878\x2dccc5\x2daf88\x2d658b5f14c570.mount: Deactivated successfully. Jan 21 00:58:55.856795 systemd[1]: run-netns-cni\x2d591ab913\x2dac9b\x2df9ef\x2dddff\x2d2f482f37d6fa.mount: Deactivated successfully. Jan 21 00:59:00.508632 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3552380539.mount: Deactivated successfully. 
Jan 21 00:59:00.535431 containerd[1680]: time="2026-01-21T00:59:00.535382809Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:59:00.537623 containerd[1680]: time="2026-01-21T00:59:00.537600245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 21 00:59:00.539200 containerd[1680]: time="2026-01-21T00:59:00.539166114Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:59:00.541827 containerd[1680]: time="2026-01-21T00:59:00.541796847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 21 00:59:00.542381 containerd[1680]: time="2026-01-21T00:59:00.542331749Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.250198169s" Jan 21 00:59:00.542381 containerd[1680]: time="2026-01-21T00:59:00.542361023Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 21 00:59:00.556574 containerd[1680]: time="2026-01-21T00:59:00.556542617Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 21 00:59:00.566152 kubelet[2901]: I0121 00:59:00.566115 2901 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Jan 21 00:59:00.578713 containerd[1680]: time="2026-01-21T00:59:00.578609432Z" level=info msg="Container 39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:59:00.592468 containerd[1680]: time="2026-01-21T00:59:00.592427624Z" level=info msg="CreateContainer within sandbox \"ff3104a46f20fde544d2a09cd0f2ae92f4b166ddb5c3f82fc868ab034ff4f89f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2\"" Jan 21 00:59:00.593850 containerd[1680]: time="2026-01-21T00:59:00.593803885Z" level=info msg="StartContainer for \"39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2\"" Jan 21 00:59:00.595571 containerd[1680]: time="2026-01-21T00:59:00.595546049Z" level=info msg="connecting to shim 39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2" address="unix:///run/containerd/s/5d5ee1b77ec652855177580ca59af291888096523a75d311b38a2ae908263b56" protocol=ttrpc version=3 Jan 21 00:59:00.607000 audit[3927]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3927 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:00.610136 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 21 00:59:00.610184 kernel: audit: type=1325 audit(1768957140.607:573): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3927 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:00.607000 audit[3927]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffed1b95210 a2=0 a3=7ffed1b951fc items=0 ppid=3056 pid=3927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.620993 kernel: audit: type=1300 audit(1768957140.607:573): arch=c000003e syscall=46 success=yes 
exit=7480 a0=3 a1=7ffed1b95210 a2=0 a3=7ffed1b951fc items=0 ppid=3056 pid=3927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.607000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:00.624067 kernel: audit: type=1327 audit(1768957140.607:573): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:00.624118 kernel: audit: type=1325 audit(1768957140.620:574): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3927 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:00.620000 audit[3927]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3927 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:00.620000 audit[3927]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffed1b95210 a2=0 a3=7ffed1b951fc items=0 ppid=3056 pid=3927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.628271 kernel: audit: type=1300 audit(1768957140.620:574): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffed1b95210 a2=0 a3=7ffed1b951fc items=0 ppid=3056 pid=3927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.620000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:00.634766 kernel: audit: type=1327 audit(1768957140.620:574): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:00.655185 systemd[1]: Started cri-containerd-39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2.scope - libcontainer container 39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2. Jan 21 00:59:00.696000 audit: BPF prog-id=172 op=LOAD Jan 21 00:59:00.696000 audit[3925]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.701218 kernel: audit: type=1334 audit(1768957140.696:575): prog-id=172 op=LOAD Jan 21 00:59:00.701275 kernel: audit: type=1300 audit(1768957140.696:575): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.696000 audit: BPF prog-id=173 op=LOAD Jan 21 00:59:00.708884 kernel: audit: type=1327 audit(1768957140.696:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.708932 kernel: audit: type=1334 audit(1768957140.696:576): prog-id=173 op=LOAD Jan 21 00:59:00.696000 audit[3925]: 
SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.696000 audit: BPF prog-id=173 op=UNLOAD Jan 21 00:59:00.696000 audit[3925]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.696000 audit: BPF prog-id=172 op=UNLOAD Jan 21 00:59:00.696000 audit[3925]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.696000 audit: BPF prog-id=174 op=LOAD Jan 
21 00:59:00.696000 audit[3925]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3429 pid=3925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:00.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3339663233306465643238643966313035326536623935666533636666 Jan 21 00:59:00.730024 containerd[1680]: time="2026-01-21T00:59:00.729989290Z" level=info msg="StartContainer for \"39f230ded28d9f1052e6b95fe3cff3b1f0560fd23e6a58ea75c2b08f93b8cdf2\" returns successfully" Jan 21 00:59:00.827064 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 21 00:59:00.827267 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 21 00:59:00.946189 kubelet[2901]: I0121 00:59:00.946057 2901 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-ca-bundle\") pod \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " Jan 21 00:59:00.946453 kubelet[2901]: I0121 00:59:00.946281 2901 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-backend-key-pair\") pod \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " Jan 21 00:59:00.946483 kubelet[2901]: I0121 00:59:00.946455 2901 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a" (UID: "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 21 00:59:00.946937 kubelet[2901]: I0121 00:59:00.946793 2901 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brvdc\" (UniqueName: \"kubernetes.io/projected/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-kube-api-access-brvdc\") pod \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\" (UID: \"c4ce6349-e9dc-4bcd-9543-fc9564c69c1a\") " Jan 21 00:59:00.946937 kubelet[2901]: I0121 00:59:00.946898 2901 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-ca-bundle\") on node \"ci-4547-0-0-n-af1f1f5a24\" DevicePath \"\"" Jan 21 00:59:00.956779 kubelet[2901]: I0121 00:59:00.956717 2901 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a" (UID: "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 21 00:59:00.956982 kubelet[2901]: I0121 00:59:00.956958 2901 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-kube-api-access-brvdc" (OuterVolumeSpecName: "kube-api-access-brvdc") pod "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a" (UID: "c4ce6349-e9dc-4bcd-9543-fc9564c69c1a"). InnerVolumeSpecName "kube-api-access-brvdc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 21 00:59:01.048150 kubelet[2901]: I0121 00:59:01.048088 2901 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brvdc\" (UniqueName: \"kubernetes.io/projected/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-kube-api-access-brvdc\") on node \"ci-4547-0-0-n-af1f1f5a24\" DevicePath \"\"" Jan 21 00:59:01.048150 kubelet[2901]: I0121 00:59:01.048116 2901 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a-whisker-backend-key-pair\") on node \"ci-4547-0-0-n-af1f1f5a24\" DevicePath \"\"" Jan 21 00:59:01.217480 systemd[1]: Removed slice kubepods-besteffort-podc4ce6349_e9dc_4bcd_9543_fc9564c69c1a.slice - libcontainer container kubepods-besteffort-podc4ce6349_e9dc_4bcd_9543_fc9564c69c1a.slice. Jan 21 00:59:01.326217 kubelet[2901]: I0121 00:59:01.325918 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m2vxz" podStartSLOduration=1.73509106 podStartE2EDuration="18.325893113s" podCreationTimestamp="2026-01-21 00:58:43 +0000 UTC" firstStartedPulling="2026-01-21 00:58:43.952404468 +0000 UTC m=+18.893020065" lastFinishedPulling="2026-01-21 00:59:00.543206515 +0000 UTC m=+35.483822118" observedRunningTime="2026-01-21 00:59:01.324980198 +0000 UTC m=+36.265595818" watchObservedRunningTime="2026-01-21 00:59:01.325893113 +0000 UTC m=+36.266509102" Jan 21 00:59:01.384001 systemd[1]: Created slice kubepods-besteffort-pod9f88865d_a485_4a8a_b0f8_118c9593a73c.slice - libcontainer container kubepods-besteffort-pod9f88865d_a485_4a8a_b0f8_118c9593a73c.slice. 
Jan 21 00:59:01.450713 kubelet[2901]: I0121 00:59:01.450425 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f88865d-a485-4a8a-b0f8-118c9593a73c-whisker-ca-bundle\") pod \"whisker-64874dbd99-cd26r\" (UID: \"9f88865d-a485-4a8a-b0f8-118c9593a73c\") " pod="calico-system/whisker-64874dbd99-cd26r" Jan 21 00:59:01.452698 kubelet[2901]: I0121 00:59:01.450877 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl949\" (UniqueName: \"kubernetes.io/projected/9f88865d-a485-4a8a-b0f8-118c9593a73c-kube-api-access-sl949\") pod \"whisker-64874dbd99-cd26r\" (UID: \"9f88865d-a485-4a8a-b0f8-118c9593a73c\") " pod="calico-system/whisker-64874dbd99-cd26r" Jan 21 00:59:01.452698 kubelet[2901]: I0121 00:59:01.450923 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9f88865d-a485-4a8a-b0f8-118c9593a73c-whisker-backend-key-pair\") pod \"whisker-64874dbd99-cd26r\" (UID: \"9f88865d-a485-4a8a-b0f8-118c9593a73c\") " pod="calico-system/whisker-64874dbd99-cd26r" Jan 21 00:59:01.509548 systemd[1]: var-lib-kubelet-pods-c4ce6349\x2de9dc\x2d4bcd\x2d9543\x2dfc9564c69c1a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbrvdc.mount: Deactivated successfully. Jan 21 00:59:01.509911 systemd[1]: var-lib-kubelet-pods-c4ce6349\x2de9dc\x2d4bcd\x2d9543\x2dfc9564c69c1a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 21 00:59:01.688371 containerd[1680]: time="2026-01-21T00:59:01.688330296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64874dbd99-cd26r,Uid:9f88865d-a485-4a8a-b0f8-118c9593a73c,Namespace:calico-system,Attempt:0,}" Jan 21 00:59:01.916967 systemd-networkd[1587]: cali71d7cf119ac: Link UP Jan 21 00:59:01.918125 systemd-networkd[1587]: cali71d7cf119ac: Gained carrier Jan 21 00:59:01.935441 containerd[1680]: 2026-01-21 00:59:01.730 [INFO][4019] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 21 00:59:01.935441 containerd[1680]: 2026-01-21 00:59:01.829 [INFO][4019] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0 whisker-64874dbd99- calico-system 9f88865d-a485-4a8a-b0f8-118c9593a73c 859 0 2026-01-21 00:59:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:64874dbd99 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 whisker-64874dbd99-cd26r eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali71d7cf119ac [] [] }} ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-" Jan 21 00:59:01.935441 containerd[1680]: 2026-01-21 00:59:01.829 [INFO][4019] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.935441 containerd[1680]: 2026-01-21 00:59:01.871 [INFO][4031] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" HandleID="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.871 [INFO][4031] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" HandleID="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7220), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"whisker-64874dbd99-cd26r", "timestamp":"2026-01-21 00:59:01.871850197 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.872 [INFO][4031] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.872 [INFO][4031] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.872 [INFO][4031] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.879 [INFO][4031] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.883 [INFO][4031] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.886 [INFO][4031] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.887 [INFO][4031] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935671 containerd[1680]: 2026-01-21 00:59:01.889 [INFO][4031] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.889 [INFO][4031] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.890 [INFO][4031] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3 Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.897 [INFO][4031] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.901 [INFO][4031] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.74.1/26] block=192.168.74.0/26 handle="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.901 [INFO][4031] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.1/26] handle="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.901 [INFO][4031] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:01.935919 containerd[1680]: 2026-01-21 00:59:01.901 [INFO][4031] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.1/26] IPv6=[] ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" HandleID="k8s-pod-network.2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.936055 containerd[1680]: 2026-01-21 00:59:01.905 [INFO][4019] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0", GenerateName:"whisker-64874dbd99-", Namespace:"calico-system", SelfLink:"", UID:"9f88865d-a485-4a8a-b0f8-118c9593a73c", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 59, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64874dbd99", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"whisker-64874dbd99-cd26r", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali71d7cf119ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:01.936055 containerd[1680]: 2026-01-21 00:59:01.905 [INFO][4019] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.1/32] ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.936129 containerd[1680]: 2026-01-21 00:59:01.905 [INFO][4019] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71d7cf119ac ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.936129 containerd[1680]: 2026-01-21 00:59:01.919 [INFO][4019] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.936169 containerd[1680]: 2026-01-21 00:59:01.919 [INFO][4019] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0", GenerateName:"whisker-64874dbd99-", Namespace:"calico-system", SelfLink:"", UID:"9f88865d-a485-4a8a-b0f8-118c9593a73c", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 59, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64874dbd99", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3", Pod:"whisker-64874dbd99-cd26r", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali71d7cf119ac", MAC:"3a:5a:80:6e:5d:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:01.936220 containerd[1680]: 2026-01-21 00:59:01.932 [INFO][4019] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" Namespace="calico-system" Pod="whisker-64874dbd99-cd26r" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-whisker--64874dbd99--cd26r-eth0" Jan 21 00:59:01.989772 containerd[1680]: time="2026-01-21T00:59:01.989225092Z" level=info msg="connecting to shim 2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3" address="unix:///run/containerd/s/ec4ad26ff06939119947585d924ade47ea21d281404a0100e1ab88c6a7471f6b" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:02.019946 systemd[1]: Started cri-containerd-2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3.scope - libcontainer container 2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3. Jan 21 00:59:02.032000 audit: BPF prog-id=175 op=LOAD Jan 21 00:59:02.033000 audit: BPF prog-id=176 op=LOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=176 op=UNLOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=177 op=LOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=178 op=LOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=178 op=UNLOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=177 op=UNLOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.033000 audit: BPF prog-id=179 op=LOAD Jan 21 00:59:02.033000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264623361313364636532623334303131386236333235303061303862 Jan 21 00:59:02.067490 containerd[1680]: time="2026-01-21T00:59:02.067404841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64874dbd99-cd26r,Uid:9f88865d-a485-4a8a-b0f8-118c9593a73c,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"2db3a13dce2b340118b632500a08b041c885e7d8158732cad3dd0b928e5c62c3\"" Jan 21 00:59:02.069114 containerd[1680]: time="2026-01-21T00:59:02.069086893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 21 00:59:02.400699 containerd[1680]: time="2026-01-21T00:59:02.400631222Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:02.402794 containerd[1680]: time="2026-01-21T00:59:02.402691600Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 21 00:59:02.402794 containerd[1680]: time="2026-01-21T00:59:02.402709947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:02.403028 kubelet[2901]: E0121 00:59:02.402987 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:02.403305 kubelet[2901]: E0121 00:59:02.403043 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:02.409873 kubelet[2901]: E0121 00:59:02.409795 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8f6a270e2c9b4b408eb629752f61ddd6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:02.413778 containerd[1680]: time="2026-01-21T00:59:02.413744221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 21 00:59:02.530000 audit: BPF prog-id=180 op=LOAD Jan 21 
00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc7cf0020 a2=98 a3=1fffffffffffffff items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.530000 audit: BPF prog-id=180 op=UNLOAD Jan 21 00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffdc7cefff0 a3=0 items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.530000 audit: BPF prog-id=181 op=LOAD Jan 21 00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc7ceff00 a2=94 a3=3 items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.530000 audit: BPF prog-id=181 op=UNLOAD Jan 21 00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffdc7ceff00 a2=94 a3=3 items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.530000 audit: BPF prog-id=182 op=LOAD Jan 21 00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc7ceff40 a2=94 a3=7ffdc7cf0120 items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.530000 audit: BPF prog-id=182 op=UNLOAD Jan 21 00:59:02.530000 audit[4237]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffdc7ceff40 a2=94 a3=7ffdc7cf0120 items=0 ppid=4119 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.530000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 21 00:59:02.532000 audit: BPF prog-id=183 op=LOAD Jan 21 00:59:02.532000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcaa9d5740 a2=98 a3=3 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.532000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.532000 audit: BPF prog-id=183 op=UNLOAD Jan 21 00:59:02.532000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcaa9d5710 a3=0 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.532000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.533000 audit: BPF prog-id=184 op=LOAD Jan 21 00:59:02.533000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcaa9d5530 a2=94 a3=54428f items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.533000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.533000 audit: BPF prog-id=184 op=UNLOAD Jan 21 00:59:02.533000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcaa9d5530 a2=94 a3=54428f items=0 ppid=4119 
pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.533000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.533000 audit: BPF prog-id=185 op=LOAD Jan 21 00:59:02.533000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcaa9d5560 a2=94 a3=2 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.533000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.533000 audit: BPF prog-id=185 op=UNLOAD Jan 21 00:59:02.533000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcaa9d5560 a2=0 a3=2 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.533000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.683000 audit: BPF prog-id=186 op=LOAD Jan 21 00:59:02.683000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcaa9d5420 a2=94 a3=1 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.683000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.683000 audit: BPF prog-id=186 op=UNLOAD Jan 21 00:59:02.683000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcaa9d5420 a2=94 a3=1 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.683000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=187 op=LOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcaa9d5410 a2=94 a3=4 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=187 op=UNLOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcaa9d5410 a2=0 a3=4 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=188 op=LOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcaa9d5270 a2=94 a3=5 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=188 op=UNLOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcaa9d5270 a2=0 a3=5 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=189 op=LOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcaa9d5490 a2=94 a3=6 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=189 op=UNLOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcaa9d5490 a2=0 a3=6 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.694000 audit: BPF prog-id=190 op=LOAD Jan 21 00:59:02.694000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcaa9d4c40 a2=94 a3=88 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.694000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.695000 audit: BPF prog-id=191 op=LOAD Jan 21 00:59:02.695000 audit[4238]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffcaa9d4ac0 a2=94 a3=2 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.695000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.695000 audit: BPF prog-id=191 op=UNLOAD Jan 21 00:59:02.695000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffcaa9d4af0 a2=0 a3=7ffcaa9d4bf0 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.695000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.695000 audit: BPF prog-id=190 op=UNLOAD Jan 21 00:59:02.695000 audit[4238]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=1e0e1d10 a2=0 a3=a5a04d01df98213 items=0 ppid=4119 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.695000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 21 00:59:02.703000 audit: BPF prog-id=192 op=LOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd1f765a60 a2=98 a3=1999999999999999 items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.703000 audit: BPF prog-id=192 op=UNLOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd1f765a30 a3=0 items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.703000 audit: BPF prog-id=193 op=LOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd1f765940 a2=94 a3=ffff items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.703000 audit: BPF prog-id=193 op=UNLOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd1f765940 a2=94 a3=ffff items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.703000 audit: BPF prog-id=194 op=LOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd1f765980 
a2=94 a3=7ffd1f765b60 items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.703000 audit: BPF prog-id=194 op=UNLOAD Jan 21 00:59:02.703000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd1f765980 a2=94 a3=7ffd1f765b60 items=0 ppid=4119 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.703000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 21 00:59:02.759692 systemd-networkd[1587]: vxlan.calico: Link UP Jan 21 00:59:02.759700 systemd-networkd[1587]: vxlan.calico: Gained carrier Jan 21 00:59:02.760224 containerd[1680]: time="2026-01-21T00:59:02.760086895Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:02.766736 containerd[1680]: time="2026-01-21T00:59:02.766626120Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 21 00:59:02.767796 containerd[1680]: time="2026-01-21T00:59:02.766715349Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:02.768296 kubelet[2901]: E0121 00:59:02.767945 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:02.768296 kubelet[2901]: E0121 00:59:02.767991 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:02.768649 kubelet[2901]: E0121 00:59:02.768095 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secre
ts/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:02.770652 kubelet[2901]: E0121 00:59:02.769574 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 00:59:02.792000 audit: BPF prog-id=195 op=LOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=3 a0=5 a1=7fff99402a70 a2=98 a3=0 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=195 op=UNLOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff99402a40 a3=0 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=196 op=LOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff99402880 a2=94 a3=54428f items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=196 op=UNLOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff99402880 a2=94 
a3=54428f items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=197 op=LOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff994028b0 a2=94 a3=2 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=197 op=UNLOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff994028b0 a2=0 a3=2 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=198 op=LOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff99402660 a2=94 a3=4 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=198 op=UNLOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff99402660 a2=94 a3=4 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=199 op=LOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff99402760 a2=94 a3=7fff994028e0 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.792000 audit: BPF prog-id=199 op=UNLOAD Jan 21 00:59:02.792000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff99402760 a2=0 a3=7fff994028e0 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.792000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.793000 audit: BPF prog-id=200 op=LOAD Jan 21 00:59:02.793000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff99401e90 a2=94 a3=2 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.793000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.793000 audit: BPF prog-id=200 op=UNLOAD Jan 21 00:59:02.793000 audit[4268]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff99401e90 a2=0 a3=2 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.793000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.793000 audit: BPF prog-id=201 op=LOAD Jan 21 00:59:02.793000 audit[4268]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff99401f90 a2=94 a3=30 items=0 ppid=4119 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.793000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 21 00:59:02.807000 audit: BPF prog-id=202 op=LOAD Jan 21 00:59:02.807000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe169d0560 a2=98 a3=0 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.807000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.807000 audit: BPF prog-id=202 op=UNLOAD Jan 21 00:59:02.807000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe169d0530 a3=0 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.807000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.807000 audit: BPF prog-id=203 op=LOAD Jan 21 00:59:02.807000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe169d0350 a2=94 a3=54428f items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.807000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.807000 audit: BPF prog-id=203 op=UNLOAD Jan 21 00:59:02.807000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe169d0350 a2=94 a3=54428f items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.807000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.808000 audit: BPF prog-id=204 op=LOAD Jan 21 00:59:02.808000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe169d0380 a2=94 a3=2 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.808000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.808000 audit: BPF prog-id=204 op=UNLOAD Jan 21 00:59:02.808000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe169d0380 a2=0 a3=2 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.808000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.964000 audit: BPF prog-id=205 op=LOAD Jan 21 00:59:02.964000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe169d0240 a2=94 a3=1 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.964000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.964000 audit: BPF prog-id=205 op=UNLOAD Jan 21 00:59:02.964000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe169d0240 a2=94 a3=1 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.964000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.974000 audit: BPF prog-id=206 op=LOAD Jan 21 00:59:02.974000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe169d0230 a2=94 a3=4 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.974000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.974000 audit: BPF prog-id=206 op=UNLOAD Jan 21 00:59:02.974000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe169d0230 a2=0 a3=4 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.974000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.974000 audit: BPF prog-id=207 op=LOAD Jan 21 00:59:02.974000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe169d0090 a2=94 a3=5 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.974000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.974000 audit: BPF prog-id=207 op=UNLOAD Jan 21 00:59:02.974000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe169d0090 a2=0 a3=5 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.974000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=208 op=LOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe169d02b0 a2=94 a3=6 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=208 op=UNLOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe169d02b0 a2=0 a3=6 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=209 op=LOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe169cfa60 a2=94 a3=88 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=210 op=LOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffe169cf8e0 a2=94 a3=2 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=210 op=UNLOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffe169cf910 a2=0 a3=7ffe169cfa10 items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.975000 audit: BPF prog-id=209 op=UNLOAD Jan 21 00:59:02.975000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=150a8d10 a2=0 a3=2be72bca7bf2048b items=0 ppid=4119 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.975000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 21 00:59:02.979000 audit: BPF prog-id=201 op=UNLOAD Jan 21 00:59:02.979000 audit[4119]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000972240 a2=0 a3=0 items=0 ppid=4101 pid=4119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:02.979000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 21 00:59:03.028000 audit[4297]: NETFILTER_CFG table=nat:119 family=2 entries=15 op=nft_register_chain pid=4297 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:03.028000 audit[4297]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffd78b82fd0 a2=0 a3=7ffd78b82fbc items=0 ppid=4119 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.028000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:03.030000 audit[4298]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4298 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:03.030000 audit[4298]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffc2a246010 a2=0 a3=7ffc2a245ffc items=0 ppid=4119 pid=4298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.030000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:03.031000 audit[4291]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4291 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:03.031000 audit[4291]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffe9f116fe0 a2=0 a3=7ffe9f116fcc items=0 ppid=4119 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.031000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:03.042000 audit[4301]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4301 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:03.042000 audit[4301]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7ffd2565f5c0 a2=0 a3=7ffd2565f5ac items=0 ppid=4119 pid=4301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.042000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:03.212486 kubelet[2901]: I0121 00:59:03.212450 2901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ce6349-e9dc-4bcd-9543-fc9564c69c1a" path="/var/lib/kubelet/pods/c4ce6349-e9dc-4bcd-9543-fc9564c69c1a/volumes" Jan 21 00:59:03.312401 kubelet[2901]: E0121 00:59:03.311721 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 00:59:03.352000 audit[4311]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:03.352000 audit[4311]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc9d669320 a2=0 a3=7ffc9d66930c items=0 ppid=3056 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.352000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:03.359000 audit[4311]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:03.359000 audit[4311]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc9d669320 a2=0 a3=0 items=0 ppid=3056 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:03.359000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:03.756299 systemd-networkd[1587]: cali71d7cf119ac: Gained IPv6LL Jan 21 00:59:04.716361 systemd-networkd[1587]: vxlan.calico: Gained IPv6LL Jan 21 00:59:06.211009 containerd[1680]: time="2026-01-21T00:59:06.210934768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gqzzb,Uid:505024bb-cdb3-4e80-9c79-74499326a9e9,Namespace:kube-system,Attempt:0,}" Jan 21 00:59:06.331472 systemd-networkd[1587]: calia400dbe04ef: Link UP Jan 21 00:59:06.331611 systemd-networkd[1587]: calia400dbe04ef: Gained carrier Jan 21 00:59:06.348112 containerd[1680]: 2026-01-21 00:59:06.251 [INFO][4314] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0 coredns-668d6bf9bc- kube-system 505024bb-cdb3-4e80-9c79-74499326a9e9 780 0 2026-01-21 00:58:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 coredns-668d6bf9bc-gqzzb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia400dbe04ef [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-" Jan 21 00:59:06.348112 containerd[1680]: 2026-01-21 00:59:06.251 [INFO][4314] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" 
WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.348112 containerd[1680]: 2026-01-21 00:59:06.288 [INFO][4326] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" HandleID="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.288 [INFO][4326] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" HandleID="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf2a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"coredns-668d6bf9bc-gqzzb", "timestamp":"2026-01-21 00:59:06.288442127 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.288 [INFO][4326] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.288 [INFO][4326] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.288 [INFO][4326] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.296 [INFO][4326] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.300 [INFO][4326] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.304 [INFO][4326] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.306 [INFO][4326] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.348825 containerd[1680]: 2026-01-21 00:59:06.308 [INFO][4326] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.308 [INFO][4326] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.312 [INFO][4326] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7 Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.319 [INFO][4326] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.326 [INFO][4326] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.74.2/26] block=192.168.74.0/26 handle="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.327 [INFO][4326] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.2/26] handle="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.327 [INFO][4326] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:06.349231 containerd[1680]: 2026-01-21 00:59:06.327 [INFO][4326] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.2/26] IPv6=[] ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" HandleID="k8s-pod-network.c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.329 [INFO][4314] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"505024bb-cdb3-4e80-9c79-74499326a9e9", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"coredns-668d6bf9bc-gqzzb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia400dbe04ef", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.329 [INFO][4314] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.2/32] ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.329 [INFO][4314] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia400dbe04ef ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.331 [INFO][4314] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.332 [INFO][4314] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"505024bb-cdb3-4e80-9c79-74499326a9e9", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7", Pod:"coredns-668d6bf9bc-gqzzb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia400dbe04ef", 
MAC:"7a:dd:d2:93:32:3c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:06.349536 containerd[1680]: 2026-01-21 00:59:06.341 [INFO][4314] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" Namespace="kube-system" Pod="coredns-668d6bf9bc-gqzzb" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--gqzzb-eth0" Jan 21 00:59:06.365432 kernel: kauditd_printk_skb: 237 callbacks suppressed Jan 21 00:59:06.365559 kernel: audit: type=1325 audit(1768957146.361:656): table=filter:125 family=2 entries=42 op=nft_register_chain pid=4344 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:06.361000 audit[4344]: NETFILTER_CFG table=filter:125 family=2 entries=42 op=nft_register_chain pid=4344 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:06.372711 kernel: audit: type=1300 audit(1768957146.361:656): arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7ffd1d97f040 a2=0 a3=7ffd1d97f02c items=0 ppid=4119 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.361000 audit[4344]: SYSCALL arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7ffd1d97f040 a2=0 a3=7ffd1d97f02c items=0 ppid=4119 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.361000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:06.377827 kernel: audit: type=1327 audit(1768957146.361:656): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:06.387712 containerd[1680]: time="2026-01-21T00:59:06.387601452Z" level=info msg="connecting to shim c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7" address="unix:///run/containerd/s/1894f11168b1de8b7261ea615da327e27c9db015b3feefadce9ac10edfa7aac7" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:06.407870 systemd[1]: Started cri-containerd-c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7.scope - libcontainer container c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7. 
Jan 21 00:59:06.416000 audit: BPF prog-id=211 op=LOAD Jan 21 00:59:06.418000 audit: BPF prog-id=212 op=LOAD Jan 21 00:59:06.420179 kernel: audit: type=1334 audit(1768957146.416:657): prog-id=211 op=LOAD Jan 21 00:59:06.420264 kernel: audit: type=1334 audit(1768957146.418:658): prog-id=212 op=LOAD Jan 21 00:59:06.420282 kernel: audit: type=1300 audit(1768957146.418:658): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.424773 kernel: audit: type=1327 audit(1768957146.418:658): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=212 op=UNLOAD Jan 21 00:59:06.428207 kernel: audit: type=1334 audit(1768957146.418:659): prog-id=212 op=UNLOAD Jan 21 00:59:06.428260 kernel: audit: type=1300 audit(1768957146.418:659): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.432936 kernel: audit: type=1327 audit(1768957146.418:659): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=213 op=LOAD Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=214 op=LOAD Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=214 op=UNLOAD Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=213 op=UNLOAD Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.418000 audit: BPF prog-id=215 op=LOAD Jan 21 00:59:06.418000 audit[4365]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4354 pid=4365 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335326237353865343039353339346238653036613430343132366630 Jan 21 00:59:06.462896 containerd[1680]: time="2026-01-21T00:59:06.462756100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gqzzb,Uid:505024bb-cdb3-4e80-9c79-74499326a9e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7\"" Jan 21 00:59:06.465361 containerd[1680]: time="2026-01-21T00:59:06.465312261Z" level=info msg="CreateContainer within sandbox \"c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 21 00:59:06.479702 containerd[1680]: time="2026-01-21T00:59:06.478084785Z" level=info msg="Container 1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:59:06.483244 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1170593030.mount: Deactivated successfully. 
Jan 21 00:59:06.489541 containerd[1680]: time="2026-01-21T00:59:06.489489205Z" level=info msg="CreateContainer within sandbox \"c52b758e4095394b8e06a404126f0ddeb1ade5f80e8595a2ec87481075cb20c7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab\"" Jan 21 00:59:06.490112 containerd[1680]: time="2026-01-21T00:59:06.490080001Z" level=info msg="StartContainer for \"1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab\"" Jan 21 00:59:06.492038 containerd[1680]: time="2026-01-21T00:59:06.491963183Z" level=info msg="connecting to shim 1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab" address="unix:///run/containerd/s/1894f11168b1de8b7261ea615da327e27c9db015b3feefadce9ac10edfa7aac7" protocol=ttrpc version=3 Jan 21 00:59:06.512096 systemd[1]: Started cri-containerd-1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab.scope - libcontainer container 1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab. 
Jan 21 00:59:06.522000 audit: BPF prog-id=216 op=LOAD Jan 21 00:59:06.524000 audit: BPF prog-id=217 op=LOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=217 op=UNLOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=218 op=LOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=219 op=LOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=219 op=UNLOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=218 op=UNLOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.524000 audit: BPF prog-id=220 op=LOAD Jan 21 00:59:06.524000 audit[4391]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4354 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:06.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162346464356232353435316430356436373166633739653333646539 Jan 21 00:59:06.545397 containerd[1680]: time="2026-01-21T00:59:06.545308993Z" level=info msg="StartContainer for \"1b4dd5b25451d05d671fc79e33de93eb3c080530c94a5d01ef22b37ddb26f5ab\" returns successfully" Jan 21 00:59:07.217365 containerd[1680]: time="2026-01-21T00:59:07.217323419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc7877fd9-wf2hg,Uid:59ce7fac-1a6e-4ec4-b99e-063ed3e3444c,Namespace:calico-system,Attempt:0,}" Jan 21 00:59:07.220719 containerd[1680]: time="2026-01-21T00:59:07.220697719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-hxx9z,Uid:22543a80-9d55-4110-b0da-aa35bd7688e7,Namespace:calico-apiserver,Attempt:0,}" Jan 21 00:59:07.354162 systemd-networkd[1587]: cali6693bac0e8a: Link UP Jan 21 00:59:07.356186 systemd-networkd[1587]: cali6693bac0e8a: Gained carrier Jan 21 00:59:07.364739 kubelet[2901]: I0121 00:59:07.364651 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/coredns-668d6bf9bc-gqzzb" podStartSLOduration=36.364636155 podStartE2EDuration="36.364636155s" podCreationTimestamp="2026-01-21 00:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:59:07.336152508 +0000 UTC m=+42.276768126" watchObservedRunningTime="2026-01-21 00:59:07.364636155 +0000 UTC m=+42.305251776" Jan 21 00:59:07.370000 audit[4462]: NETFILTER_CFG table=filter:126 family=2 entries=20 op=nft_register_rule pid=4462 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:07.370000 audit[4462]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc070a15a0 a2=0 a3=7ffc070a158c items=0 ppid=3056 pid=4462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.370000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.270 [INFO][4424] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0 calico-kube-controllers-bc7877fd9- calico-system 59ce7fac-1a6e-4ec4-b99e-063ed3e3444c 787 0 2026-01-21 00:58:43 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:bc7877fd9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 calico-kube-controllers-bc7877fd9-wf2hg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6693bac0e8a [] [] }} 
ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.270 [INFO][4424] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.304 [INFO][4448] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" HandleID="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.304 [INFO][4448] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" HandleID="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5cc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"calico-kube-controllers-bc7877fd9-wf2hg", "timestamp":"2026-01-21 00:59:07.304138679 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 
00:59:07.305 [INFO][4448] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.305 [INFO][4448] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.305 [INFO][4448] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.313 [INFO][4448] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.317 [INFO][4448] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.322 [INFO][4448] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.324 [INFO][4448] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.326 [INFO][4448] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.326 [INFO][4448] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.327 [INFO][4448] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.331 [INFO][4448] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 
handle="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4448] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.74.3/26] block=192.168.74.0/26 handle="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4448] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.3/26] handle="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4448] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:07.377159 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4448] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.3/26] IPv6=[] ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" HandleID="k8s-pod-network.d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.349 [INFO][4424] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0", GenerateName:"calico-kube-controllers-bc7877fd9-", Namespace:"calico-system", SelfLink:"", UID:"59ce7fac-1a6e-4ec4-b99e-063ed3e3444c", 
ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bc7877fd9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"calico-kube-controllers-bc7877fd9-wf2hg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6693bac0e8a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.349 [INFO][4424] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.3/32] ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.349 [INFO][4424] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6693bac0e8a ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 
21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.356 [INFO][4424] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.356 [INFO][4424] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0", GenerateName:"calico-kube-controllers-bc7877fd9-", Namespace:"calico-system", SelfLink:"", UID:"59ce7fac-1a6e-4ec4-b99e-063ed3e3444c", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bc7877fd9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b", 
Pod:"calico-kube-controllers-bc7877fd9-wf2hg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6693bac0e8a", MAC:"42:c1:92:88:53:3b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:07.378061 containerd[1680]: 2026-01-21 00:59:07.373 [INFO][4424] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" Namespace="calico-system" Pod="calico-kube-controllers-bc7877fd9-wf2hg" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--kube--controllers--bc7877fd9--wf2hg-eth0" Jan 21 00:59:07.383000 audit[4462]: NETFILTER_CFG table=nat:127 family=2 entries=14 op=nft_register_rule pid=4462 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:07.383000 audit[4462]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc070a15a0 a2=0 a3=0 items=0 ppid=3056 pid=4462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.383000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:07.411000 audit[4471]: NETFILTER_CFG table=filter:128 family=2 entries=40 op=nft_register_chain pid=4471 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:07.411000 audit[4471]: SYSCALL arch=c000003e syscall=46 success=yes exit=20764 a0=3 a1=7ffd5644daf0 a2=0 a3=7ffd5644dadc items=0 ppid=4119 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.411000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:07.426230 containerd[1680]: time="2026-01-21T00:59:07.426110179Z" level=info msg="connecting to shim d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b" address="unix:///run/containerd/s/1ec0d151b6fc0243be3ead97e92d67cbf9c4f045b996b45d40c52b51c9b8ed14" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:07.426000 audit[4475]: NETFILTER_CFG table=filter:129 family=2 entries=17 op=nft_register_rule pid=4475 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:07.426000 audit[4475]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffecaeffce0 a2=0 a3=7ffecaeffccc items=0 ppid=3056 pid=4475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.426000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:07.430000 audit[4475]: NETFILTER_CFG table=nat:130 family=2 entries=35 op=nft_register_chain pid=4475 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:07.430000 audit[4475]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffecaeffce0 a2=0 a3=7ffecaeffccc items=0 ppid=3056 pid=4475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.430000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 
00:59:07.460884 systemd[1]: Started cri-containerd-d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b.scope - libcontainer container d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b. Jan 21 00:59:07.475244 systemd-networkd[1587]: cali32ec7d30da0: Link UP Jan 21 00:59:07.477763 systemd-networkd[1587]: cali32ec7d30da0: Gained carrier Jan 21 00:59:07.493000 audit: BPF prog-id=221 op=LOAD Jan 21 00:59:07.494000 audit: BPF prog-id=222 op=LOAD Jan 21 00:59:07.494000 audit[4493]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.494000 audit: BPF prog-id=222 op=UNLOAD Jan 21 00:59:07.494000 audit[4493]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.494000 audit: BPF prog-id=223 op=LOAD Jan 21 00:59:07.494000 audit[4493]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.494000 audit: BPF prog-id=224 op=LOAD Jan 21 00:59:07.494000 audit[4493]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.494000 audit: BPF prog-id=224 op=UNLOAD Jan 21 00:59:07.494000 audit[4493]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.495000 audit: BPF prog-id=223 op=UNLOAD Jan 21 00:59:07.495000 audit[4493]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 
ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.495000 audit: BPF prog-id=225 op=LOAD Jan 21 00:59:07.495000 audit[4493]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4482 pid=4493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353465306630316530613566646430316139363234346230663966 Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.274 [INFO][4434] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0 calico-apiserver-54cbb8896b- calico-apiserver 22543a80-9d55-4110-b0da-aa35bd7688e7 790 0 2026-01-21 00:58:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54cbb8896b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 calico-apiserver-54cbb8896b-hxx9z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali32ec7d30da0 [] [] 
}} ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.274 [INFO][4434] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.309 [INFO][4451] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" HandleID="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.309 [INFO][4451] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" HandleID="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"calico-apiserver-54cbb8896b-hxx9z", "timestamp":"2026-01-21 00:59:07.309622901 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.309 [INFO][4451] 
ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4451] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.344 [INFO][4451] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.417 [INFO][4451] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.425 [INFO][4451] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.437 [INFO][4451] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.443 [INFO][4451] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.445 [INFO][4451] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.447 [INFO][4451] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.450 [INFO][4451] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.457 [INFO][4451] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 
handle="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.467 [INFO][4451] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.74.4/26] block=192.168.74.0/26 handle="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.467 [INFO][4451] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.4/26] handle="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.467 [INFO][4451] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:07.499998 containerd[1680]: 2026-01-21 00:59:07.467 [INFO][4451] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.4/26] IPv6=[] ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" HandleID="k8s-pod-network.b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.472 [INFO][4434] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0", GenerateName:"calico-apiserver-54cbb8896b-", Namespace:"calico-apiserver", SelfLink:"", UID:"22543a80-9d55-4110-b0da-aa35bd7688e7", ResourceVersion:"790", Generation:0, 
CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cbb8896b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"calico-apiserver-54cbb8896b-hxx9z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali32ec7d30da0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.472 [INFO][4434] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.4/32] ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.472 [INFO][4434] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32ec7d30da0 ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.479 
[INFO][4434] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.480 [INFO][4434] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0", GenerateName:"calico-apiserver-54cbb8896b-", Namespace:"calico-apiserver", SelfLink:"", UID:"22543a80-9d55-4110-b0da-aa35bd7688e7", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cbb8896b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e", Pod:"calico-apiserver-54cbb8896b-hxx9z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali32ec7d30da0", MAC:"4a:4f:2f:16:10:8e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:07.501758 containerd[1680]: 2026-01-21 00:59:07.496 [INFO][4434] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-hxx9z" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--hxx9z-eth0" Jan 21 00:59:07.528552 containerd[1680]: time="2026-01-21T00:59:07.528513556Z" level=info msg="connecting to shim b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e" address="unix:///run/containerd/s/612282b66978ab07eddc9ef9bdda5062455947c36fd8535e307619a56afb731d" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:07.542000 audit[4540]: NETFILTER_CFG table=filter:131 family=2 entries=58 op=nft_register_chain pid=4540 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:07.542000 audit[4540]: SYSCALL arch=c000003e syscall=46 success=yes exit=30584 a0=3 a1=7ffeb794f880 a2=0 a3=7ffeb794f86c items=0 ppid=4119 pid=4540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.542000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:07.556891 systemd[1]: Started cri-containerd-b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e.scope - libcontainer container 
b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e. Jan 21 00:59:07.568831 containerd[1680]: time="2026-01-21T00:59:07.568775437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc7877fd9-wf2hg,Uid:59ce7fac-1a6e-4ec4-b99e-063ed3e3444c,Namespace:calico-system,Attempt:0,} returns sandbox id \"d854e0f01e0a5fdd01a96244b0f9f2c382518e2232c7ce347d17e2a30175048b\"" Jan 21 00:59:07.571062 containerd[1680]: time="2026-01-21T00:59:07.571035770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 21 00:59:07.571000 audit: BPF prog-id=226 op=LOAD Jan 21 00:59:07.572000 audit: BPF prog-id=227 op=LOAD Jan 21 00:59:07.572000 audit[4542]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228238 a2=98 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.572000 audit: BPF prog-id=227 op=UNLOAD Jan 21 00:59:07.572000 audit[4542]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 
00:59:07.573000 audit: BPF prog-id=228 op=LOAD Jan 21 00:59:07.573000 audit[4542]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228488 a2=98 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.573000 audit: BPF prog-id=229 op=LOAD Jan 21 00:59:07.573000 audit[4542]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000228218 a2=98 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.573000 audit: BPF prog-id=229 op=UNLOAD Jan 21 00:59:07.573000 audit[4542]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.573000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.573000 audit: BPF prog-id=228 op=UNLOAD Jan 21 00:59:07.573000 audit[4542]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.573000 audit: BPF prog-id=230 op=LOAD Jan 21 00:59:07.573000 audit[4542]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002286e8 a2=98 a3=0 items=0 ppid=4530 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:07.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234316132343737396231353430303864323131326561333930616462 Jan 21 00:59:07.613522 containerd[1680]: time="2026-01-21T00:59:07.613490399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-hxx9z,Uid:22543a80-9d55-4110-b0da-aa35bd7688e7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b41a24779b154008d2112ea390adbba2d6266c99e501518124c1e4050e70139e\"" Jan 21 00:59:07.661092 
systemd-networkd[1587]: calia400dbe04ef: Gained IPv6LL Jan 21 00:59:07.908023 containerd[1680]: time="2026-01-21T00:59:07.907824170Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:07.909842 containerd[1680]: time="2026-01-21T00:59:07.909797959Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 21 00:59:07.909996 containerd[1680]: time="2026-01-21T00:59:07.909870894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:07.910190 kubelet[2901]: E0121 00:59:07.910151 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:07.910244 kubelet[2901]: E0121 00:59:07.910201 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:07.910495 kubelet[2901]: E0121 00:59:07.910411 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:07.911558 containerd[1680]: time="2026-01-21T00:59:07.910437123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:07.911876 kubelet[2901]: E0121 00:59:07.911856 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:08.211059 containerd[1680]: time="2026-01-21T00:59:08.211018749Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-666569f655-95c8v,Uid:be64252f-80c4-46f6-a3d4-52a6471b1a63,Namespace:calico-system,Attempt:0,}" Jan 21 00:59:08.211248 containerd[1680]: time="2026-01-21T00:59:08.211227020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8lkb9,Uid:851d5829-334a-4f46-97de-87be973a0b77,Namespace:calico-system,Attempt:0,}" Jan 21 00:59:08.247785 containerd[1680]: time="2026-01-21T00:59:08.247747739Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:08.249178 containerd[1680]: time="2026-01-21T00:59:08.249145313Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:08.249241 containerd[1680]: time="2026-01-21T00:59:08.249229849Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:08.249520 kubelet[2901]: E0121 00:59:08.249454 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:08.249520 kubelet[2901]: E0121 00:59:08.249496 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:08.250237 kubelet[2901]: E0121 00:59:08.249891 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dbdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:08.251518 kubelet[2901]: E0121 00:59:08.251491 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:08.324080 kubelet[2901]: E0121 00:59:08.324027 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:08.324653 kubelet[2901]: E0121 00:59:08.324592 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:08.352197 systemd-networkd[1587]: cali7e596df7f28: Link UP Jan 21 00:59:08.354510 systemd-networkd[1587]: cali7e596df7f28: Gained carrier Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.270 [INFO][4580] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0 csi-node-driver- calico-system 851d5829-334a-4f46-97de-87be973a0b77 685 0 2026-01-21 00:58:43 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 csi-node-driver-8lkb9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7e596df7f28 [] [] }} ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" 
WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.270 [INFO][4580] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.299 [INFO][4603] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" HandleID="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.299 [INFO][4603] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" HandleID="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"csi-node-driver-8lkb9", "timestamp":"2026-01-21 00:59:08.299384546 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.299 [INFO][4603] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.299 [INFO][4603] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.299 [INFO][4603] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.306 [INFO][4603] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.310 [INFO][4603] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.314 [INFO][4603] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.316 [INFO][4603] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.317 [INFO][4603] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.317 [INFO][4603] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.319 [INFO][4603] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.327 [INFO][4603] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.336 [INFO][4603] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.74.5/26] block=192.168.74.0/26 handle="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.337 [INFO][4603] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.5/26] handle="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.337 [INFO][4603] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:08.371570 containerd[1680]: 2026-01-21 00:59:08.337 [INFO][4603] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.5/26] IPv6=[] ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" HandleID="k8s-pod-network.532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.371000 audit[4621]: NETFILTER_CFG table=filter:132 family=2 entries=14 op=nft_register_rule pid=4621 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:08.371000 audit[4621]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd6347b140 a2=0 a3=7ffd6347b12c items=0 ppid=3056 pid=4621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.371000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:08.373059 containerd[1680]: 2026-01-21 00:59:08.341 [INFO][4580] cni-plugin/k8s.go 418: Populated endpoint ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" 
WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"851d5829-334a-4f46-97de-87be973a0b77", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"csi-node-driver-8lkb9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7e596df7f28", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:08.373059 containerd[1680]: 2026-01-21 00:59:08.342 [INFO][4580] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.5/32] ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.373059 containerd[1680]: 
2026-01-21 00:59:08.342 [INFO][4580] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7e596df7f28 ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.373059 containerd[1680]: 2026-01-21 00:59:08.356 [INFO][4580] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.373059 containerd[1680]: 2026-01-21 00:59:08.357 [INFO][4580] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"851d5829-334a-4f46-97de-87be973a0b77", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d", Pod:"csi-node-driver-8lkb9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7e596df7f28", MAC:"12:20:da:08:80:00", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:08.373059 containerd[1680]: 2026-01-21 00:59:08.368 [INFO][4580] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" Namespace="calico-system" Pod="csi-node-driver-8lkb9" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-csi--node--driver--8lkb9-eth0" Jan 21 00:59:08.376000 audit[4621]: NETFILTER_CFG table=nat:133 family=2 entries=20 op=nft_register_rule pid=4621 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:08.376000 audit[4621]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd6347b140 a2=0 a3=7ffd6347b12c items=0 ppid=3056 pid=4621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.376000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:08.387000 audit[4628]: NETFILTER_CFG table=filter:134 family=2 entries=54 op=nft_register_chain pid=4628 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:08.387000 audit[4628]: SYSCALL arch=c000003e syscall=46 success=yes exit=25992 a0=3 
a1=7ffc9ba67730 a2=0 a3=7ffc9ba6771c items=0 ppid=4119 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.387000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:08.402239 containerd[1680]: time="2026-01-21T00:59:08.402157022Z" level=info msg="connecting to shim 532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d" address="unix:///run/containerd/s/c48c5c8f7a2a7b9ae6065674abe5ca63d03d2d12b3c77312f760016dcf4d0d24" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:08.432926 systemd[1]: Started cri-containerd-532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d.scope - libcontainer container 532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d. 
Jan 21 00:59:08.444764 systemd-networkd[1587]: cali86c93394697: Link UP Jan 21 00:59:08.445339 systemd-networkd[1587]: cali86c93394697: Gained carrier Jan 21 00:59:08.457000 audit: BPF prog-id=231 op=LOAD Jan 21 00:59:08.458000 audit: BPF prog-id=232 op=LOAD Jan 21 00:59:08.458000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.458000 audit: BPF prog-id=232 op=UNLOAD Jan 21 00:59:08.458000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.459000 audit: BPF prog-id=233 op=LOAD Jan 21 00:59:08.459000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.459000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.459000 audit: BPF prog-id=234 op=LOAD Jan 21 00:59:08.459000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.459000 audit: BPF prog-id=234 op=UNLOAD Jan 21 00:59:08.459000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.459000 audit: BPF prog-id=233 op=UNLOAD Jan 21 00:59:08.459000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:59:08.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.459000 audit: BPF prog-id=235 op=LOAD Jan 21 00:59:08.459000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533326337373537663332633731316533306234303639613866646435 Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.276 [INFO][4584] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0 goldmane-666569f655- calico-system be64252f-80c4-46f6-a3d4-52a6471b1a63 789 0 2026-01-21 00:58:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 goldmane-666569f655-95c8v eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali86c93394697 [] [] }} ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.276 
[INFO][4584] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.301 [INFO][4608] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" HandleID="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.301 [INFO][4608] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" HandleID="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5910), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"goldmane-666569f655-95c8v", "timestamp":"2026-01-21 00:59:08.301171779 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.301 [INFO][4608] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.337 [INFO][4608] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.337 [INFO][4608] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.410 [INFO][4608] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.417 [INFO][4608] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.421 [INFO][4608] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.423 [INFO][4608] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.425 [INFO][4608] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.425 [INFO][4608] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.428 [INFO][4608] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.434 [INFO][4608] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.440 [INFO][4608] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.74.6/26] block=192.168.74.0/26 handle="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.440 [INFO][4608] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.6/26] handle="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.440 [INFO][4608] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:08.468795 containerd[1680]: 2026-01-21 00:59:08.440 [INFO][4608] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.6/26] IPv6=[] ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" HandleID="k8s-pod-network.4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.442 [INFO][4584] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"be64252f-80c4-46f6-a3d4-52a6471b1a63", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"goldmane-666569f655-95c8v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali86c93394697", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.442 [INFO][4584] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.6/32] ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.442 [INFO][4584] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali86c93394697 ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.446 [INFO][4584] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.449 [INFO][4584] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"be64252f-80c4-46f6-a3d4-52a6471b1a63", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d", Pod:"goldmane-666569f655-95c8v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali86c93394697", MAC:"16:ed:af:23:00:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:08.469324 containerd[1680]: 2026-01-21 00:59:08.466 [INFO][4584] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" Namespace="calico-system" Pod="goldmane-666569f655-95c8v" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-goldmane--666569f655--95c8v-eth0" Jan 21 00:59:08.499907 containerd[1680]: time="2026-01-21T00:59:08.499853117Z" level=info msg="connecting to shim 4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d" address="unix:///run/containerd/s/8832e37c1bcc4ddb3005c0b2f2c06b36a8455396382107411f900c5dd55ef67a" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:08.500635 containerd[1680]: time="2026-01-21T00:59:08.500611588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8lkb9,Uid:851d5829-334a-4f46-97de-87be973a0b77,Namespace:calico-system,Attempt:0,} returns sandbox id \"532c7757f32c711e30b4069a8fdd570f7bcc2e3cc03598ea64a60536adaa8a3d\"" Jan 21 00:59:08.505326 containerd[1680]: time="2026-01-21T00:59:08.505300202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 21 00:59:08.504000 audit[4695]: NETFILTER_CFG table=filter:135 family=2 entries=56 op=nft_register_chain pid=4695 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:08.504000 audit[4695]: SYSCALL arch=c000003e syscall=46 success=yes exit=28728 a0=3 a1=7fffc4d9cc30 a2=0 a3=7fffc4d9cc1c items=0 ppid=4119 pid=4695 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.504000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:08.529927 systemd[1]: Started cri-containerd-4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d.scope - libcontainer container 4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d. 
Jan 21 00:59:08.539000 audit: BPF prog-id=236 op=LOAD Jan 21 00:59:08.540000 audit: BPF prog-id=237 op=LOAD Jan 21 00:59:08.540000 audit[4707]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.540000 audit: BPF prog-id=237 op=UNLOAD Jan 21 00:59:08.540000 audit[4707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.540000 audit: BPF prog-id=238 op=LOAD Jan 21 00:59:08.540000 audit[4707]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.540000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.540000 audit: BPF prog-id=239 op=LOAD Jan 21 00:59:08.540000 audit[4707]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.540000 audit: BPF prog-id=239 op=UNLOAD Jan 21 00:59:08.540000 audit[4707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.541000 audit: BPF prog-id=238 op=UNLOAD Jan 21 00:59:08.541000 audit[4707]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
00:59:08.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.541000 audit: BPF prog-id=240 op=LOAD Jan 21 00:59:08.541000 audit[4707]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4694 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:08.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462396638353439663638313363373037316132613837633037356464 Jan 21 00:59:08.581327 containerd[1680]: time="2026-01-21T00:59:08.581233913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-95c8v,Uid:be64252f-80c4-46f6-a3d4-52a6471b1a63,Namespace:calico-system,Attempt:0,} returns sandbox id \"4b9f8549f6813c7071a2a87c075ddc321ddf59b7476053607a48cc9cdb34490d\"" Jan 21 00:59:08.811871 systemd-networkd[1587]: cali6693bac0e8a: Gained IPv6LL Jan 21 00:59:08.847025 containerd[1680]: time="2026-01-21T00:59:08.846974165Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:08.848741 containerd[1680]: time="2026-01-21T00:59:08.848713296Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 21 00:59:08.849004 containerd[1680]: time="2026-01-21T00:59:08.848781215Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:08.849035 kubelet[2901]: E0121 00:59:08.848912 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:08.849035 kubelet[2901]: E0121 00:59:08.848956 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:08.849304 kubelet[2901]: E0121 00:59:08.849174 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnl
y:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:08.850173 containerd[1680]: time="2026-01-21T00:59:08.850151516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 21 00:59:09.068798 systemd-networkd[1587]: cali32ec7d30da0: Gained IPv6LL Jan 21 00:59:09.169266 containerd[1680]: time="2026-01-21T00:59:09.169229587Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:09.171602 containerd[1680]: time="2026-01-21T00:59:09.171506272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 21 00:59:09.171602 containerd[1680]: time="2026-01-21T00:59:09.171577134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:09.171840 kubelet[2901]: E0121 00:59:09.171808 2901 log.go:32] 
"PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:09.171879 kubelet[2901]: E0121 00:59:09.171852 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:09.172116 kubelet[2901]: E0121 00:59:09.172041 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,Recu
rsiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:09.173557 kubelet[2901]: E0121 00:59:09.173335 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:59:09.173628 containerd[1680]: time="2026-01-21T00:59:09.173387849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 21 00:59:09.327535 kubelet[2901]: E0121 00:59:09.327420 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:59:09.329622 kubelet[2901]: E0121 00:59:09.329579 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:09.329789 kubelet[2901]: E0121 00:59:09.329605 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:09.354000 audit[4734]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=4734 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:09.354000 audit[4734]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe3d990420 a2=0 a3=7ffe3d99040c items=0 ppid=3056 pid=4734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:09.354000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:09.359000 audit[4734]: NETFILTER_CFG table=nat:137 family=2 entries=20 op=nft_register_rule pid=4734 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:09.359000 audit[4734]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe3d990420 a2=0 a3=7ffe3d99040c items=0 ppid=3056 pid=4734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:09.359000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:09.684903 containerd[1680]: time="2026-01-21T00:59:09.684791219Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:09.688608 containerd[1680]: time="2026-01-21T00:59:09.688563607Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 21 00:59:09.688711 containerd[1680]: time="2026-01-21T00:59:09.688643163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:09.689040 kubelet[2901]: E0121 00:59:09.688820 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:09.689040 kubelet[2901]: E0121 00:59:09.688891 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:09.689040 kubelet[2901]: E0121 00:59:09.688999 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:09.690321 kubelet[2901]: E0121 00:59:09.690284 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:59:09.963932 systemd-networkd[1587]: cali86c93394697: Gained IPv6LL Jan 21 00:59:10.211455 containerd[1680]: time="2026-01-21T00:59:10.211376652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-89fcv,Uid:557eba89-d604-4304-afae-f0e623ef8722,Namespace:calico-apiserver,Attempt:0,}" Jan 21 00:59:10.211735 containerd[1680]: time="2026-01-21T00:59:10.211588212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-78b44,Uid:9663badd-5b67-43ba-94ba-c1609d9bd7c0,Namespace:kube-system,Attempt:0,}" Jan 21 00:59:10.331894 kubelet[2901]: E0121 00:59:10.331798 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 
00:59:10.336179 systemd-networkd[1587]: calia9aa0f3a790: Link UP Jan 21 00:59:10.343826 systemd-networkd[1587]: calia9aa0f3a790: Gained carrier Jan 21 00:59:10.346542 kubelet[2901]: E0121 00:59:10.345884 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:59:10.350350 systemd-networkd[1587]: cali7e596df7f28: Gained IPv6LL Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.263 [INFO][4740] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0 coredns-668d6bf9bc- kube-system 9663badd-5b67-43ba-94ba-c1609d9bd7c0 788 0 2026-01-21 00:58:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-af1f1f5a24 coredns-668d6bf9bc-78b44 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia9aa0f3a790 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.263 [INFO][4740] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.294 [INFO][4761] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" HandleID="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.294 [INFO][4761] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" HandleID="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"coredns-668d6bf9bc-78b44", "timestamp":"2026-01-21 00:59:10.294374762 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.294 [INFO][4761] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.294 [INFO][4761] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.294 [INFO][4761] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.301 [INFO][4761] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.305 [INFO][4761] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.309 [INFO][4761] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.310 [INFO][4761] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.312 [INFO][4761] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.312 [INFO][4761] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.315 [INFO][4761] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.319 [INFO][4761] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" 
host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.326 [INFO][4761] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.74.7/26] block=192.168.74.0/26 handle="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.327 [INFO][4761] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.7/26] handle="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.327 [INFO][4761] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 21 00:59:10.376314 containerd[1680]: 2026-01-21 00:59:10.327 [INFO][4761] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.7/26] IPv6=[] ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" HandleID="k8s-pod-network.e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.330 [INFO][4740] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9663badd-5b67-43ba-94ba-c1609d9bd7c0", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"coredns-668d6bf9bc-78b44", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia9aa0f3a790", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.331 [INFO][4740] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.7/32] ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.331 [INFO][4740] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia9aa0f3a790 ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" 
WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.358 [INFO][4740] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.359 [INFO][4740] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9663badd-5b67-43ba-94ba-c1609d9bd7c0", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e", Pod:"coredns-668d6bf9bc-78b44", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.7/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia9aa0f3a790", MAC:"22:56:26:50:5c:8d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:10.378306 containerd[1680]: 2026-01-21 00:59:10.370 [INFO][4740] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" Namespace="kube-system" Pod="coredns-668d6bf9bc-78b44" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-coredns--668d6bf9bc--78b44-eth0" Jan 21 00:59:10.417701 containerd[1680]: time="2026-01-21T00:59:10.416603742Z" level=info msg="connecting to shim e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e" address="unix:///run/containerd/s/08da20b01112edd108a97b80c0b982e2fa3f3c9122b0283469c40642b99e6b45" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:10.453000 audit[4809]: NETFILTER_CFG table=filter:138 family=2 entries=54 op=nft_register_chain pid=4809 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:10.453000 audit[4809]: SYSCALL arch=c000003e syscall=46 success=yes exit=25556 a0=3 a1=7ffdd553a5d0 a2=0 a3=7ffdd553a5bc items=0 ppid=4119 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.453000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:10.454866 systemd[1]: Started cri-containerd-e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e.scope - libcontainer container e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e. Jan 21 00:59:10.459389 systemd-networkd[1587]: calif4e36df4b9d: Link UP Jan 21 00:59:10.460427 systemd-networkd[1587]: calif4e36df4b9d: Gained carrier Jan 21 00:59:10.480000 audit: BPF prog-id=241 op=LOAD Jan 21 00:59:10.481000 audit: BPF prog-id=242 op=LOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.481000 audit: BPF prog-id=242 op=UNLOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.481000 audit: BPF prog-id=243 op=LOAD Jan 21 
00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.481000 audit: BPF prog-id=244 op=LOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.481000 audit: BPF prog-id=244 op=UNLOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 
00:59:10.481000 audit: BPF prog-id=243 op=UNLOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.481000 audit: BPF prog-id=245 op=LOAD Jan 21 00:59:10.481000 audit[4801]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4790 pid=4801 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533646232333563396336363230663636316233396562346366663134 Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.257 [INFO][4736] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0 calico-apiserver-54cbb8896b- calico-apiserver 557eba89-d604-4304-afae-f0e623ef8722 791 0 2026-01-21 00:58:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54cbb8896b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s 
ci-4547-0-0-n-af1f1f5a24 calico-apiserver-54cbb8896b-89fcv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif4e36df4b9d [] [] }} ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.258 [INFO][4736] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.301 [INFO][4759] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" HandleID="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.302 [INFO][4759] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" HandleID="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-af1f1f5a24", "pod":"calico-apiserver-54cbb8896b-89fcv", "timestamp":"2026-01-21 00:59:10.301909516 +0000 UTC"}, Hostname:"ci-4547-0-0-n-af1f1f5a24", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.302 [INFO][4759] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.327 [INFO][4759] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.327 [INFO][4759] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-af1f1f5a24' Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.402 [INFO][4759] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.412 [INFO][4759] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.420 [INFO][4759] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.425 [INFO][4759] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.431 [INFO][4759] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.431 [INFO][4759] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.433 [INFO][4759] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6 Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.442 [INFO][4759] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.453 [INFO][4759] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.74.8/26] block=192.168.74.0/26 handle="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.453 [INFO][4759] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.8/26] handle="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" host="ci-4547-0-0-n-af1f1f5a24" Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.454 [INFO][4759] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 21 00:59:10.485560 containerd[1680]: 2026-01-21 00:59:10.454 [INFO][4759] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.74.8/26] IPv6=[] ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" HandleID="k8s-pod-network.f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Workload="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.456 [INFO][4736] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0", GenerateName:"calico-apiserver-54cbb8896b-", Namespace:"calico-apiserver", SelfLink:"", UID:"557eba89-d604-4304-afae-f0e623ef8722", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cbb8896b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"", Pod:"calico-apiserver-54cbb8896b-89fcv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e36df4b9d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.456 [INFO][4736] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.8/32] ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.456 [INFO][4736] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif4e36df4b9d ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.461 [INFO][4736] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.461 [INFO][4736] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0", GenerateName:"calico-apiserver-54cbb8896b-", Namespace:"calico-apiserver", SelfLink:"", UID:"557eba89-d604-4304-afae-f0e623ef8722", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2026, time.January, 21, 0, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cbb8896b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-af1f1f5a24", ContainerID:"f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6", Pod:"calico-apiserver-54cbb8896b-89fcv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e36df4b9d", MAC:"b6:95:6c:60:c1:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 21 00:59:10.486163 containerd[1680]: 2026-01-21 00:59:10.481 [INFO][4736] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" Namespace="calico-apiserver" Pod="calico-apiserver-54cbb8896b-89fcv" WorkloadEndpoint="ci--4547--0--0--n--af1f1f5a24-k8s-calico--apiserver--54cbb8896b--89fcv-eth0" Jan 21 00:59:10.519188 containerd[1680]: time="2026-01-21T00:59:10.519146256Z" 
level=info msg="connecting to shim f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6" address="unix:///run/containerd/s/33e5b0b39409817f2dce95ae8f853a1c6809266ff8a539b1366c2952ad57836e" namespace=k8s.io protocol=ttrpc version=3 Jan 21 00:59:10.540729 containerd[1680]: time="2026-01-21T00:59:10.540537337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-78b44,Uid:9663badd-5b67-43ba-94ba-c1609d9bd7c0,Namespace:kube-system,Attempt:0,} returns sandbox id \"e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e\"" Jan 21 00:59:10.546003 containerd[1680]: time="2026-01-21T00:59:10.545711736Z" level=info msg="CreateContainer within sandbox \"e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 21 00:59:10.570136 containerd[1680]: time="2026-01-21T00:59:10.570104652Z" level=info msg="Container 9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7: CDI devices from CRI Config.CDIDevices: []" Jan 21 00:59:10.570000 audit[4868]: NETFILTER_CFG table=filter:139 family=2 entries=53 op=nft_register_chain pid=4868 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 21 00:59:10.571925 systemd[1]: Started cri-containerd-f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6.scope - libcontainer container f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6. 
Jan 21 00:59:10.570000 audit[4868]: SYSCALL arch=c000003e syscall=46 success=yes exit=26608 a0=3 a1=7ffe17b73b70 a2=0 a3=7ffe17b73b5c items=0 ppid=4119 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.570000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 21 00:59:10.579882 containerd[1680]: time="2026-01-21T00:59:10.579849393Z" level=info msg="CreateContainer within sandbox \"e3db235c9c6620f661b39eb4cff14290d68413175cc1ecdcc7f4a25d0082493e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7\"" Jan 21 00:59:10.580887 containerd[1680]: time="2026-01-21T00:59:10.580862872Z" level=info msg="StartContainer for \"9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7\"" Jan 21 00:59:10.581722 containerd[1680]: time="2026-01-21T00:59:10.581701534Z" level=info msg="connecting to shim 9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7" address="unix:///run/containerd/s/08da20b01112edd108a97b80c0b982e2fa3f3c9122b0283469c40642b99e6b45" protocol=ttrpc version=3 Jan 21 00:59:10.595000 audit: BPF prog-id=246 op=LOAD Jan 21 00:59:10.597000 audit: BPF prog-id=247 op=LOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=247 op=UNLOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=248 op=LOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=249 op=LOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=249 op=UNLOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=248 op=UNLOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.597000 audit: BPF prog-id=250 op=LOAD Jan 21 00:59:10.597000 audit[4856]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4837 pid=4856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.597000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666562633436633562353336656633396135373431613532626237 Jan 21 00:59:10.612432 systemd[1]: Started cri-containerd-9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7.scope - libcontainer container 9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7. Jan 21 00:59:10.623000 audit: BPF prog-id=251 op=LOAD Jan 21 00:59:10.624000 audit: BPF prog-id=252 op=LOAD Jan 21 00:59:10.624000 audit[4871]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.624000 audit: BPF prog-id=252 op=UNLOAD Jan 21 00:59:10.624000 audit[4871]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 
00:59:10.624000 audit: BPF prog-id=253 op=LOAD Jan 21 00:59:10.624000 audit[4871]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.625000 audit: BPF prog-id=254 op=LOAD Jan 21 00:59:10.625000 audit[4871]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.625000 audit: BPF prog-id=254 op=UNLOAD Jan 21 00:59:10.625000 audit[4871]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.625000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.625000 audit: BPF prog-id=253 op=UNLOAD Jan 21 00:59:10.625000 audit[4871]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.626000 audit: BPF prog-id=255 op=LOAD Jan 21 00:59:10.626000 audit[4871]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4790 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:10.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965316336303663343661653534363333323635313335643265656434 Jan 21 00:59:10.654433 containerd[1680]: time="2026-01-21T00:59:10.654395806Z" level=info msg="StartContainer for \"9e1c606c46ae54633265135d2eed4113f1f5e0b80dd781b5719035f2cbffa1c7\" returns successfully" Jan 21 00:59:10.661330 containerd[1680]: time="2026-01-21T00:59:10.661208039Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-54cbb8896b-89fcv,Uid:557eba89-d604-4304-afae-f0e623ef8722,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f5febc46c5b536ef39a5741a52bb77d0f7f425619041593d8e10b597e5149ca6\"" Jan 21 00:59:10.663182 containerd[1680]: time="2026-01-21T00:59:10.663132124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:11.007696 containerd[1680]: time="2026-01-21T00:59:11.007618175Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:11.010394 containerd[1680]: time="2026-01-21T00:59:11.010298631Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:11.010394 containerd[1680]: time="2026-01-21T00:59:11.010309433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:11.010635 kubelet[2901]: E0121 00:59:11.010602 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:11.010722 kubelet[2901]: E0121 00:59:11.010653 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:11.010956 kubelet[2901]: E0121 00:59:11.010778 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hxfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:11.012709 kubelet[2901]: E0121 00:59:11.012652 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:11.337106 kubelet[2901]: E0121 00:59:11.336620 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:11.350700 kubelet[2901]: I0121 00:59:11.349872 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-78b44" podStartSLOduration=40.349856445 podStartE2EDuration="40.349856445s" podCreationTimestamp="2026-01-21 00:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:59:11.347613331 +0000 UTC m=+46.288228927" watchObservedRunningTime="2026-01-21 00:59:11.349856445 +0000 UTC m=+46.290472060" Jan 21 00:59:11.376000 audit[4918]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4918 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.379148 kernel: kauditd_printk_skb: 233 callbacks suppressed Jan 21 00:59:11.379299 kernel: audit: type=1325 audit(1768957151.376:743): table=filter:140 family=2 entries=14 op=nft_register_rule pid=4918 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.376000 audit[4918]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc4b5bd560 a2=0 a3=7ffc4b5bd54c items=0 ppid=3056 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.376000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.386789 kernel: audit: type=1300 audit(1768957151.376:743): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc4b5bd560 a2=0 a3=7ffc4b5bd54c items=0 ppid=3056 pid=4918 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.386870 kernel: audit: type=1327 audit(1768957151.376:743): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.386000 audit[4918]: NETFILTER_CFG table=nat:141 family=2 entries=44 op=nft_register_rule pid=4918 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.386000 audit[4918]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffc4b5bd560 a2=0 a3=7ffc4b5bd54c items=0 ppid=3056 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.390796 kernel: audit: type=1325 audit(1768957151.386:744): table=nat:141 family=2 entries=44 op=nft_register_rule pid=4918 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.390832 kernel: audit: type=1300 audit(1768957151.386:744): arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffc4b5bd560 a2=0 a3=7ffc4b5bd54c items=0 ppid=3056 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.386000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.394310 kernel: audit: type=1327 audit(1768957151.386:744): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.403000 audit[4920]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=4920 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.403000 audit[4920]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe8956e4a0 a2=0 a3=7ffe8956e48c items=0 ppid=3056 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.408025 kernel: audit: type=1325 audit(1768957151.403:745): table=filter:142 family=2 entries=14 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.408155 kernel: audit: type=1300 audit(1768957151.403:745): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe8956e4a0 a2=0 a3=7ffe8956e48c items=0 ppid=3056 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.403000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.413017 kernel: audit: type=1327 audit(1768957151.403:745): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.424000 audit[4920]: NETFILTER_CFG table=nat:143 family=2 entries=56 op=nft_register_chain pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.424000 audit[4920]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffe8956e4a0 a2=0 a3=7ffe8956e48c items=0 ppid=3056 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 00:59:11.424000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 00:59:11.428730 kernel: audit: type=1325 audit(1768957151.424:746): table=nat:143 family=2 entries=56 op=nft_register_chain pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 00:59:11.691843 systemd-networkd[1587]: calif4e36df4b9d: Gained IPv6LL Jan 21 00:59:12.204132 systemd-networkd[1587]: calia9aa0f3a790: Gained IPv6LL Jan 21 00:59:12.338386 kubelet[2901]: E0121 00:59:12.338356 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:16.212013 containerd[1680]: time="2026-01-21T00:59:16.211935343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 21 00:59:16.571244 containerd[1680]: time="2026-01-21T00:59:16.571030251Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:16.573285 containerd[1680]: time="2026-01-21T00:59:16.573190998Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 21 00:59:16.573285 containerd[1680]: time="2026-01-21T00:59:16.573226670Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:16.573446 kubelet[2901]: E0121 00:59:16.573413 2901 log.go:32] "PullImage from image service 
failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:16.573749 kubelet[2901]: E0121 00:59:16.573456 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:16.573749 kubelet[2901]: E0121 00:59:16.573549 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8f6a270e2c9b4b408eb629752f61ddd6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:f
alse,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:16.576082 containerd[1680]: time="2026-01-21T00:59:16.576056248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 21 00:59:16.906677 containerd[1680]: time="2026-01-21T00:59:16.906448757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:16.908907 containerd[1680]: time="2026-01-21T00:59:16.908844984Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 21 00:59:16.909009 containerd[1680]: time="2026-01-21T00:59:16.908976793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:16.909266 kubelet[2901]: E0121 00:59:16.909236 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:16.909396 kubelet[2901]: E0121 00:59:16.909346 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:16.909886 kubelet[2901]: E0121 00:59:16.909840 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,R
esizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:16.911176 kubelet[2901]: E0121 00:59:16.911128 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 00:59:20.212574 containerd[1680]: time="2026-01-21T00:59:20.212534856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:20.544900 containerd[1680]: time="2026-01-21T00:59:20.544668119Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:20.546749 containerd[1680]: time="2026-01-21T00:59:20.546638260Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:20.546749 containerd[1680]: time="2026-01-21T00:59:20.546704201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 
21 00:59:20.546917 kubelet[2901]: E0121 00:59:20.546863 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:20.547239 kubelet[2901]: E0121 00:59:20.546922 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:20.547538 kubelet[2901]: E0121 00:59:20.547185 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dbdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:20.548620 kubelet[2901]: E0121 00:59:20.548598 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:22.212413 containerd[1680]: time="2026-01-21T00:59:22.212034527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 21 00:59:22.546325 containerd[1680]: time="2026-01-21T00:59:22.546199728Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 
00:59:22.548451 containerd[1680]: time="2026-01-21T00:59:22.548411019Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 21 00:59:22.548751 containerd[1680]: time="2026-01-21T00:59:22.548490055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:22.548784 kubelet[2901]: E0121 00:59:22.548612 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:22.548784 kubelet[2901]: E0121 00:59:22.548655 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:22.549071 kubelet[2901]: E0121 00:59:22.548895 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 21 00:59:22.551928 containerd[1680]: time="2026-01-21T00:59:22.551908221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 21 00:59:22.893882 containerd[1680]: time="2026-01-21T00:59:22.893756917Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:22.895714 containerd[1680]: time="2026-01-21T00:59:22.895310557Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 21 00:59:22.895714 containerd[1680]: time="2026-01-21T00:59:22.895405808Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:22.896614 kubelet[2901]: E0121 00:59:22.896408 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:22.896614 kubelet[2901]: E0121 00:59:22.896457 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:22.896614 kubelet[2901]: E0121 00:59:22.896569 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:22.897796 kubelet[2901]: E0121 00:59:22.897739 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:59:23.213025 containerd[1680]: time="2026-01-21T00:59:23.212597650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 21 00:59:23.561561 containerd[1680]: time="2026-01-21T00:59:23.561275738Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:23.562851 containerd[1680]: time="2026-01-21T00:59:23.562822603Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 21 00:59:23.562969 containerd[1680]: time="2026-01-21T00:59:23.562888369Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:23.563229 kubelet[2901]: E0121 00:59:23.563060 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:23.563229 kubelet[2901]: E0121 00:59:23.563111 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:23.563492 kubelet[2901]: E0121 00:59:23.563316 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Probe
Handler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:23.564010 containerd[1680]: time="2026-01-21T00:59:23.563760269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:23.566304 kubelet[2901]: E0121 00:59:23.565734 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:59:23.901029 containerd[1680]: time="2026-01-21T00:59:23.900739190Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:23.903547 containerd[1680]: time="2026-01-21T00:59:23.903451330Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:23.903547 containerd[1680]: time="2026-01-21T00:59:23.903523278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:23.903839 kubelet[2901]: E0121 00:59:23.903789 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:23.903885 kubelet[2901]: E0121 00:59:23.903851 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:23.904018 kubelet[2901]: E0121 00:59:23.903958 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hxfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:23.905234 kubelet[2901]: E0121 00:59:23.905199 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:24.212299 containerd[1680]: time="2026-01-21T00:59:24.212154624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 21 00:59:24.554152 containerd[1680]: time="2026-01-21T00:59:24.554018459Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:24.555972 containerd[1680]: time="2026-01-21T00:59:24.555926962Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 21 00:59:24.556045 containerd[1680]: time="2026-01-21T00:59:24.555998676Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:24.556666 kubelet[2901]: E0121 00:59:24.556144 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:24.556666 kubelet[2901]: E0121 00:59:24.556194 2901 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:24.556666 kubelet[2901]: E0121 00:59:24.556309 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:24.557759 kubelet[2901]: E0121 00:59:24.557736 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:28.213526 kubelet[2901]: 
E0121 00:59:28.213483 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 00:59:35.212834 kubelet[2901]: E0121 00:59:35.212092 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:35.214329 kubelet[2901]: E0121 00:59:35.214305 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:59:35.214515 kubelet[2901]: E0121 00:59:35.214501 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:35.214631 kubelet[2901]: E0121 00:59:35.214614 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:59:36.211602 kubelet[2901]: E0121 00:59:36.211425 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:41.217765 containerd[1680]: time="2026-01-21T00:59:41.217667425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 21 00:59:41.755070 containerd[1680]: time="2026-01-21T00:59:41.754995043Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:41.756948 containerd[1680]: time="2026-01-21T00:59:41.756856246Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 21 00:59:41.756948 containerd[1680]: time="2026-01-21T00:59:41.756932102Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:41.757549 kubelet[2901]: E0121 00:59:41.757130 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:41.757549 kubelet[2901]: E0121 00:59:41.757170 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 00:59:41.757549 kubelet[2901]: E0121 00:59:41.757263 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8f6a270e2c9b4b408eb629752f61ddd6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:41.759489 containerd[1680]: time="2026-01-21T00:59:41.759397183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 21 00:59:42.118839 containerd[1680]: 
time="2026-01-21T00:59:42.118351989Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:42.120352 containerd[1680]: time="2026-01-21T00:59:42.120257016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 21 00:59:42.120352 containerd[1680]: time="2026-01-21T00:59:42.120287780Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:42.120698 kubelet[2901]: E0121 00:59:42.120654 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:42.120774 kubelet[2901]: E0121 00:59:42.120758 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 00:59:42.120937 kubelet[2901]: E0121 00:59:42.120909 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:42.122675 kubelet[2901]: E0121 00:59:42.122142 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 00:59:47.212629 containerd[1680]: time="2026-01-21T00:59:47.212166522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:47.544790 containerd[1680]: time="2026-01-21T00:59:47.544228648Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:47.546392 containerd[1680]: time="2026-01-21T00:59:47.546270310Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:47.546392 containerd[1680]: time="2026-01-21T00:59:47.546362508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:47.546561 kubelet[2901]: E0121 00:59:47.546519 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:47.546949 kubelet[2901]: E0121 00:59:47.546573 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:47.546949 kubelet[2901]: E0121 00:59:47.546711 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hxfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:47.548161 kubelet[2901]: E0121 00:59:47.548132 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 00:59:48.211990 containerd[1680]: time="2026-01-21T00:59:48.211772577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 21 00:59:48.553169 containerd[1680]: time="2026-01-21T00:59:48.552783232Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 
00:59:48.555525 containerd[1680]: time="2026-01-21T00:59:48.555360410Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 21 00:59:48.555525 containerd[1680]: time="2026-01-21T00:59:48.555447320Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:48.555878 kubelet[2901]: E0121 00:59:48.555830 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:48.556929 kubelet[2901]: E0121 00:59:48.555886 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 00:59:48.556929 kubelet[2901]: E0121 00:59:48.556089 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:48.557073 containerd[1680]: time="2026-01-21T00:59:48.556640845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 00:59:48.557700 kubelet[2901]: E0121 00:59:48.557512 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 00:59:48.889706 containerd[1680]: time="2026-01-21T00:59:48.888794693Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 
00:59:48.891892 containerd[1680]: time="2026-01-21T00:59:48.891834336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 00:59:48.892359 containerd[1680]: time="2026-01-21T00:59:48.891878549Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:48.892477 kubelet[2901]: E0121 00:59:48.892451 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:48.892577 kubelet[2901]: E0121 00:59:48.892566 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 00:59:48.892857 kubelet[2901]: E0121 00:59:48.892799 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dbdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:48.894919 kubelet[2901]: E0121 00:59:48.894856 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 00:59:50.212903 containerd[1680]: time="2026-01-21T00:59:50.212439551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 21 00:59:50.567118 containerd[1680]: time="2026-01-21T00:59:50.566911789Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:50.568676 containerd[1680]: time="2026-01-21T00:59:50.568620725Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 21 00:59:50.568676 containerd[1680]: time="2026-01-21T00:59:50.568652425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:50.568964 kubelet[2901]: E0121 00:59:50.568898 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:50.568964 kubelet[2901]: E0121 00:59:50.568952 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 00:59:50.569656 kubelet[2901]: E0121 00:59:50.569384 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:50.570031 containerd[1680]: time="2026-01-21T00:59:50.569994550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 21 00:59:50.571236 kubelet[2901]: E0121 00:59:50.571200 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" 
podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 00:59:50.919846 containerd[1680]: time="2026-01-21T00:59:50.919612754Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:50.921959 containerd[1680]: time="2026-01-21T00:59:50.921868742Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 21 00:59:50.921959 containerd[1680]: time="2026-01-21T00:59:50.921923084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:50.922136 kubelet[2901]: E0121 00:59:50.922077 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:50.923191 kubelet[2901]: E0121 00:59:50.922148 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 00:59:50.923191 kubelet[2901]: E0121 00:59:50.922286 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 21 00:59:50.923970 containerd[1680]: time="2026-01-21T00:59:50.923950657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 21 00:59:51.238949 containerd[1680]: time="2026-01-21T00:59:51.238899549Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 00:59:51.241306 containerd[1680]: time="2026-01-21T00:59:51.241261372Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 21 00:59:51.241386 containerd[1680]: time="2026-01-21T00:59:51.241345318Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 21 00:59:51.241534 kubelet[2901]: E0121 00:59:51.241504 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:51.241580 kubelet[2901]: E0121 00:59:51.241547 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 00:59:51.241938 kubelet[2901]: E0121 00:59:51.241651 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 21 00:59:51.244062 kubelet[2901]: E0121 00:59:51.244030 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 00:59:53.217507 kubelet[2901]: E0121 00:59:53.217451 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:00:00.212604 kubelet[2901]: E0121 01:00:00.212567 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:00:02.212269 kubelet[2901]: E0121 01:00:02.212225 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:00:02.213969 kubelet[2901]: E0121 01:00:02.213933 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:00:03.212616 kubelet[2901]: 
E0121 01:00:03.212579 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:00:06.211790 kubelet[2901]: E0121 01:00:06.211663 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:00:08.213980 kubelet[2901]: E0121 01:00:08.213849 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:00:14.215248 kubelet[2901]: E0121 01:00:14.215211 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:00:15.215137 kubelet[2901]: E0121 01:00:15.214865 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:00:16.211837 kubelet[2901]: E0121 01:00:16.211522 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:00:16.212706 kubelet[2901]: E0121 01:00:16.212599 2901 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:00:19.214436 kubelet[2901]: E0121 01:00:19.214384 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:00:19.218319 kubelet[2901]: E0121 01:00:19.218260 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:00:27.212425 kubelet[2901]: E0121 01:00:27.212377 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:00:28.213601 kubelet[2901]: E0121 01:00:28.213185 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:00:28.214340 containerd[1680]: time="2026-01-21T01:00:28.214313157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 01:00:28.214841 kubelet[2901]: E0121 01:00:28.214764 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:00:28.567375 containerd[1680]: time="2026-01-21T01:00:28.567274807Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:28.569005 containerd[1680]: time="2026-01-21T01:00:28.568968715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 01:00:28.569093 containerd[1680]: time="2026-01-21T01:00:28.569043076Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:28.569279 kubelet[2901]: E0121 01:00:28.569231 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:00:28.569361 kubelet[2901]: E0121 01:00:28.569349 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:00:28.570594 kubelet[2901]: E0121 01:00:28.569829 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hxfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:28.571219 kubelet[2901]: E0121 01:00:28.571195 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:00:33.213757 containerd[1680]: time="2026-01-21T01:00:33.213388795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 21 01:00:33.568345 containerd[1680]: time="2026-01-21T01:00:33.568239358Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 
01:00:33.569785 containerd[1680]: time="2026-01-21T01:00:33.569743489Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 21 01:00:33.569935 containerd[1680]: time="2026-01-21T01:00:33.569815881Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:33.570080 kubelet[2901]: E0121 01:00:33.570046 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 01:00:33.570728 kubelet[2901]: E0121 01:00:33.570351 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 01:00:33.570728 kubelet[2901]: E0121 01:00:33.570452 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8f6a270e2c9b4b408eb629752f61ddd6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:33.574063 containerd[1680]: time="2026-01-21T01:00:33.573864892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 21 01:00:33.924189 containerd[1680]: 
time="2026-01-21T01:00:33.924097290Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:33.926007 containerd[1680]: time="2026-01-21T01:00:33.925967413Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 21 01:00:33.926116 containerd[1680]: time="2026-01-21T01:00:33.926044210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:33.926781 kubelet[2901]: E0121 01:00:33.926711 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 01:00:33.926781 kubelet[2901]: E0121 01:00:33.926758 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 01:00:33.927783 kubelet[2901]: E0121 01:00:33.927031 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:33.929073 kubelet[2901]: E0121 01:00:33.929034 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:00:34.211699 containerd[1680]: time="2026-01-21T01:00:34.211462043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 21 01:00:34.551236 containerd[1680]: time="2026-01-21T01:00:34.551134789Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:34.553266 containerd[1680]: time="2026-01-21T01:00:34.553157767Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 21 01:00:34.553266 containerd[1680]: time="2026-01-21T01:00:34.553238013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:34.554751 kubelet[2901]: E0121 01:00:34.554712 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 01:00:34.554819 kubelet[2901]: E0121 01:00:34.554762 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 01:00:34.554912 kubelet[2901]: E0121 01:00:34.554875 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:34.556298 kubelet[2901]: E0121 01:00:34.556272 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 
21 01:00:39.214066 containerd[1680]: time="2026-01-21T01:00:39.213767699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 01:00:39.554890 containerd[1680]: time="2026-01-21T01:00:39.554581205Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:39.557710 containerd[1680]: time="2026-01-21T01:00:39.556585269Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 01:00:39.557853 containerd[1680]: time="2026-01-21T01:00:39.556713667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:39.558174 kubelet[2901]: E0121 01:00:39.557978 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:00:39.558174 kubelet[2901]: E0121 01:00:39.558024 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:00:39.558174 kubelet[2901]: E0121 01:00:39.558140 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dbdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:39.559952 kubelet[2901]: E0121 01:00:39.559876 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:00:42.065000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.5.74:22-4.153.228.146:44744 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:42.065911 systemd[1]: Started sshd@9-10.0.5.74:22-4.153.228.146:44744.service - OpenSSH per-connection server daemon (4.153.228.146:44744). Jan 21 01:00:42.067201 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 21 01:00:42.067270 kernel: audit: type=1130 audit(1768957242.065:747): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.5.74:22-4.153.228.146:44744 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:00:42.627703 kernel: audit: type=1101 audit(1768957242.621:748): pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.621000 audit[5078]: USER_ACCT pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.621000 audit[5078]: CRED_ACQ pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.624058 sshd-session[5078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:00:42.628136 sshd[5078]: Accepted publickey for core from 4.153.228.146 port 44744 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:00:42.631768 kernel: audit: type=1103 audit(1768957242.621:749): pid=5078 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.635705 kernel: audit: type=1006 audit(1768957242.621:750): pid=5078 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 21 01:00:42.621000 audit[5078]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd29dd26e0 a2=3 a3=0 items=0 ppid=1 pid=5078 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:42.638895 systemd-logind[1653]: New session 11 of user core. Jan 21 01:00:42.643144 kernel: audit: type=1300 audit(1768957242.621:750): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd29dd26e0 a2=3 a3=0 items=0 ppid=1 pid=5078 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:42.643198 kernel: audit: type=1327 audit(1768957242.621:750): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:42.621000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:42.649225 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 21 01:00:42.655000 audit[5078]: USER_START pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.660878 kernel: audit: type=1105 audit(1768957242.655:751): pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.660000 audit[5082]: CRED_ACQ pid=5082 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:42.664744 kernel: audit: type=1103 audit(1768957242.660:752): 
pid=5082 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:43.000677 sshd[5082]: Connection closed by 4.153.228.146 port 44744 Jan 21 01:00:43.001849 sshd-session[5078]: pam_unix(sshd:session): session closed for user core Jan 21 01:00:43.002000 audit[5078]: USER_END pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:43.006010 systemd-logind[1653]: Session 11 logged out. Waiting for processes to exit. Jan 21 01:00:43.007135 systemd[1]: sshd@9-10.0.5.74:22-4.153.228.146:44744.service: Deactivated successfully. Jan 21 01:00:43.009983 kernel: audit: type=1106 audit(1768957243.002:753): pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:43.010285 systemd[1]: session-11.scope: Deactivated successfully. Jan 21 01:00:43.002000 audit[5078]: CRED_DISP pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:43.013260 systemd-logind[1653]: Removed session 11. 
Jan 21 01:00:43.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.5.74:22-4.153.228.146:44744 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:43.015709 kernel: audit: type=1104 audit(1768957243.002:754): pid=5078 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:43.214873 kubelet[2901]: E0121 01:00:43.214807 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:00:43.216086 containerd[1680]: time="2026-01-21T01:00:43.215791772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 21 01:00:43.588705 containerd[1680]: time="2026-01-21T01:00:43.588532881Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:43.590972 containerd[1680]: time="2026-01-21T01:00:43.590853293Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 21 01:00:43.590972 containerd[1680]: time="2026-01-21T01:00:43.590899177Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:43.592882 kubelet[2901]: E0121 01:00:43.592822 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 01:00:43.592975 kubelet[2901]: E0121 01:00:43.592894 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 21 01:00:43.593105 kubelet[2901]: E0121 01:00:43.593061 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.cr
t,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:43.594932 containerd[1680]: time="2026-01-21T01:00:43.594833088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 21 01:00:43.595024 kubelet[2901]: E0121 01:00:43.594881 2901 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:00:43.969844 containerd[1680]: time="2026-01-21T01:00:43.969793978Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:43.971503 containerd[1680]: time="2026-01-21T01:00:43.971466561Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 21 01:00:43.971627 containerd[1680]: time="2026-01-21T01:00:43.971541687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:43.971801 kubelet[2901]: E0121 01:00:43.971738 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 01:00:43.971860 kubelet[2901]: E0121 01:00:43.971830 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 21 01:00:43.972202 kubelet[2901]: E0121 01:00:43.972155 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:43.974236 containerd[1680]: time="2026-01-21T01:00:43.974161051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 21 01:00:44.342707 containerd[1680]: time="2026-01-21T01:00:44.340344281Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:00:44.342707 containerd[1680]: time="2026-01-21T01:00:44.342035823Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 21 01:00:44.342707 containerd[1680]: time="2026-01-21T01:00:44.342072841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 21 01:00:44.343087 kubelet[2901]: E0121 01:00:44.342262 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 01:00:44.343087 kubelet[2901]: E0121 01:00:44.342300 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 21 01:00:44.343087 kubelet[2901]: E0121 01:00:44.342392 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 21 01:00:44.343755 kubelet[2901]: E0121 01:00:44.343728 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:00:46.211727 kubelet[2901]: E0121 01:00:46.211268 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:00:47.214540 kubelet[2901]: E0121 01:00:47.214492 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:00:48.107933 systemd[1]: Started sshd@10-10.0.5.74:22-4.153.228.146:58438.service - OpenSSH per-connection server daemon (4.153.228.146:58438). Jan 21 01:00:48.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.5.74:22-4.153.228.146:58438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:48.108846 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 21 01:00:48.108886 kernel: audit: type=1130 audit(1768957248.107:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.5.74:22-4.153.228.146:58438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:00:48.635000 audit[5096]: USER_ACCT pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.636947 sshd[5096]: Accepted publickey for core from 4.153.228.146 port 58438 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:00:48.641703 kernel: audit: type=1101 audit(1768957248.635:757): pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.641000 audit[5096]: CRED_ACQ pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.643470 sshd-session[5096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:00:48.645706 kernel: audit: type=1103 audit(1768957248.641:758): pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.648694 kernel: audit: type=1006 audit(1768957248.642:759): pid=5096 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 21 01:00:48.642000 audit[5096]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf8a75e30 a2=3 a3=0 items=0 ppid=1 pid=5096 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:48.642000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:48.654805 kernel: audit: type=1300 audit(1768957248.642:759): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcf8a75e30 a2=3 a3=0 items=0 ppid=1 pid=5096 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:48.654855 kernel: audit: type=1327 audit(1768957248.642:759): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:48.656415 systemd-logind[1653]: New session 12 of user core. Jan 21 01:00:48.665161 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 21 01:00:48.667000 audit[5096]: USER_START pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.669000 audit[5100]: CRED_ACQ pid=5100 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.674350 kernel: audit: type=1105 audit(1768957248.667:760): pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:48.674426 kernel: audit: type=1103 audit(1768957248.669:761): 
pid=5100 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:49.023749 sshd[5100]: Connection closed by 4.153.228.146 port 58438 Jan 21 01:00:49.025601 sshd-session[5096]: pam_unix(sshd:session): session closed for user core Jan 21 01:00:49.028000 audit[5096]: USER_END pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:49.030824 systemd-logind[1653]: Session 12 logged out. Waiting for processes to exit. Jan 21 01:00:49.032672 systemd[1]: sshd@10-10.0.5.74:22-4.153.228.146:58438.service: Deactivated successfully. Jan 21 01:00:49.034724 kernel: audit: type=1106 audit(1768957249.028:762): pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:49.036001 systemd[1]: session-12.scope: Deactivated successfully. Jan 21 01:00:49.028000 audit[5096]: CRED_DISP pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:49.040662 systemd-logind[1653]: Removed session 12. 
Jan 21 01:00:49.041644 kernel: audit: type=1104 audit(1768957249.028:763): pid=5096 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:49.032000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.5.74:22-4.153.228.146:58438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:51.213632 kubelet[2901]: E0121 01:00:51.213579 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:00:54.133880 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 21 01:00:54.133986 kernel: audit: type=1130 audit(1768957254.131:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.5.74:22-4.153.228.146:58450 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:54.131000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.5.74:22-4.153.228.146:58450 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:54.131952 systemd[1]: Started sshd@11-10.0.5.74:22-4.153.228.146:58450.service - OpenSSH per-connection server daemon (4.153.228.146:58450). 
Jan 21 01:00:54.653000 audit[5113]: USER_ACCT pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.655706 sshd[5113]: Accepted publickey for core from 4.153.228.146 port 58450 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:00:54.656570 sshd-session[5113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:00:54.658974 kernel: audit: type=1101 audit(1768957254.653:766): pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.654000 audit[5113]: CRED_ACQ pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.662510 systemd-logind[1653]: New session 13 of user core. 
Jan 21 01:00:54.665190 kernel: audit: type=1103 audit(1768957254.654:767): pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.665245 kernel: audit: type=1006 audit(1768957254.654:768): pid=5113 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 21 01:00:54.654000 audit[5113]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc46a81db0 a2=3 a3=0 items=0 ppid=1 pid=5113 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:54.668640 kernel: audit: type=1300 audit(1768957254.654:768): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc46a81db0 a2=3 a3=0 items=0 ppid=1 pid=5113 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:54.654000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:54.670908 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 21 01:00:54.671809 kernel: audit: type=1327 audit(1768957254.654:768): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:54.679814 kernel: audit: type=1105 audit(1768957254.674:769): pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.674000 audit[5113]: USER_START pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.679000 audit[5117]: CRED_ACQ pid=5117 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:54.683704 kernel: audit: type=1103 audit(1768957254.679:770): pid=5117 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.003255 sshd[5117]: Connection closed by 4.153.228.146 port 58450 Jan 21 01:00:55.004045 sshd-session[5113]: pam_unix(sshd:session): session closed for user core Jan 21 01:00:55.005000 audit[5113]: USER_END pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.009904 systemd[1]: sshd@11-10.0.5.74:22-4.153.228.146:58450.service: Deactivated successfully. Jan 21 01:00:55.011751 kernel: audit: type=1106 audit(1768957255.005:771): pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.005000 audit[5113]: CRED_DISP pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.012332 systemd[1]: session-13.scope: Deactivated successfully. Jan 21 01:00:55.014770 systemd-logind[1653]: Session 13 logged out. Waiting for processes to exit. Jan 21 01:00:55.009000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.5.74:22-4.153.228.146:58450 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:55.016220 kernel: audit: type=1104 audit(1768957255.005:772): pid=5113 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.016844 systemd-logind[1653]: Removed session 13. Jan 21 01:00:55.122000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.5.74:22-4.153.228.146:53490 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:00:55.123092 systemd[1]: Started sshd@12-10.0.5.74:22-4.153.228.146:53490.service - OpenSSH per-connection server daemon (4.153.228.146:53490). Jan 21 01:00:55.665000 audit[5129]: USER_ACCT pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.666778 sshd[5129]: Accepted publickey for core from 4.153.228.146 port 53490 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:00:55.668320 sshd-session[5129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:00:55.666000 audit[5129]: CRED_ACQ pid=5129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.666000 audit[5129]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc572434d0 a2=3 a3=0 items=0 ppid=1 pid=5129 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:55.666000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:55.676839 systemd-logind[1653]: New session 14 of user core. Jan 21 01:00:55.680276 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 21 01:00:55.684000 audit[5129]: USER_START pid=5129 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:55.687000 audit[5133]: CRED_ACQ pid=5133 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.086741 sshd[5133]: Connection closed by 4.153.228.146 port 53490 Jan 21 01:00:56.087569 sshd-session[5129]: pam_unix(sshd:session): session closed for user core Jan 21 01:00:56.088000 audit[5129]: USER_END pid=5129 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.088000 audit[5129]: CRED_DISP pid=5129 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.090906 systemd[1]: sshd@12-10.0.5.74:22-4.153.228.146:53490.service: Deactivated successfully. Jan 21 01:00:56.090000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.5.74:22-4.153.228.146:53490 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:56.094244 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 21 01:00:56.096747 systemd-logind[1653]: Session 14 logged out. Waiting for processes to exit. Jan 21 01:00:56.097484 systemd-logind[1653]: Removed session 14. Jan 21 01:00:56.195000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.5.74:22-4.153.228.146:53506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:56.195927 systemd[1]: Started sshd@13-10.0.5.74:22-4.153.228.146:53506.service - OpenSSH per-connection server daemon (4.153.228.146:53506). Jan 21 01:00:56.211914 kubelet[2901]: E0121 01:00:56.211869 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:00:56.740000 audit[5146]: USER_ACCT pid=5146 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.741861 sshd[5146]: Accepted publickey for core from 4.153.228.146 port 53506 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:00:56.741000 audit[5146]: CRED_ACQ pid=5146 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.741000 audit[5146]: SYSCALL arch=c000003e syscall=1 
success=yes exit=3 a0=8 a1=7ffc2d22ca70 a2=3 a3=0 items=0 ppid=1 pid=5146 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:00:56.741000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:00:56.742839 sshd-session[5146]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:00:56.749829 systemd-logind[1653]: New session 15 of user core. Jan 21 01:00:56.756927 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 21 01:00:56.760000 audit[5146]: USER_START pid=5146 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:56.763000 audit[5151]: CRED_ACQ pid=5151 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:57.113265 sshd[5151]: Connection closed by 4.153.228.146 port 53506 Jan 21 01:00:57.113405 sshd-session[5146]: pam_unix(sshd:session): session closed for user core Jan 21 01:00:57.117000 audit[5146]: USER_END pid=5146 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:57.117000 audit[5146]: CRED_DISP pid=5146 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:00:57.119776 systemd-logind[1653]: Session 15 logged out. Waiting for processes to exit. Jan 21 01:00:57.122354 systemd[1]: sshd@13-10.0.5.74:22-4.153.228.146:53506.service: Deactivated successfully. Jan 21 01:00:57.121000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.5.74:22-4.153.228.146:53506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:00:57.125248 systemd[1]: session-15.scope: Deactivated successfully. Jan 21 01:00:57.127299 systemd-logind[1653]: Removed session 15. Jan 21 01:00:58.213981 kubelet[2901]: E0121 01:00:58.213712 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:00:58.214429 kubelet[2901]: E0121 01:00:58.214027 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:00:58.214822 kubelet[2901]: E0121 
01:00:58.213800 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:01:01.214539 kubelet[2901]: E0121 01:01:01.214498 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:01:02.211867 kubelet[2901]: E0121 01:01:02.211822 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:01:02.227000 systemd[1]: Started sshd@14-10.0.5.74:22-4.153.228.146:53516.service - OpenSSH per-connection server daemon (4.153.228.146:53516). Jan 21 01:01:02.228625 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 21 01:01:02.228699 kernel: audit: type=1130 audit(1768957262.226:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.5.74:22-4.153.228.146:53516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:02.226000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.5.74:22-4.153.228.146:53516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:01:02.778000 audit[5166]: USER_ACCT pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.779509 sshd[5166]: Accepted publickey for core from 4.153.228.146 port 53516 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:02.783487 sshd-session[5166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:02.780000 audit[5166]: CRED_ACQ pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.785350 kernel: audit: type=1101 audit(1768957262.778:793): pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.785407 kernel: audit: type=1103 audit(1768957262.780:794): pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.790424 systemd-logind[1653]: New session 16 of user core. 
Jan 21 01:01:02.791769 kernel: audit: type=1006 audit(1768957262.780:795): pid=5166 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 21 01:01:02.780000 audit[5166]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7aff8be0 a2=3 a3=0 items=0 ppid=1 pid=5166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:02.798713 kernel: audit: type=1300 audit(1768957262.780:795): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff7aff8be0 a2=3 a3=0 items=0 ppid=1 pid=5166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:02.798913 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 21 01:01:02.780000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:02.801000 audit[5166]: USER_START pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.806893 kernel: audit: type=1327 audit(1768957262.780:795): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:02.806946 kernel: audit: type=1105 audit(1768957262.801:796): pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 
01:01:02.805000 audit[5195]: CRED_ACQ pid=5195 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:02.811255 kernel: audit: type=1103 audit(1768957262.805:797): pid=5195 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.148618 sshd[5195]: Connection closed by 4.153.228.146 port 53516 Jan 21 01:01:03.150694 sshd-session[5166]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:03.151000 audit[5166]: USER_END pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.154067 systemd[1]: sshd@14-10.0.5.74:22-4.153.228.146:53516.service: Deactivated successfully. Jan 21 01:01:03.151000 audit[5166]: CRED_DISP pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.159339 systemd[1]: session-16.scope: Deactivated successfully. 
Jan 21 01:01:03.160315 kernel: audit: type=1106 audit(1768957263.151:798): pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.160377 kernel: audit: type=1104 audit(1768957263.151:799): pid=5166 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.161967 systemd-logind[1653]: Session 16 logged out. Waiting for processes to exit. Jan 21 01:01:03.153000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.5.74:22-4.153.228.146:53516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:03.163085 systemd-logind[1653]: Removed session 16. Jan 21 01:01:03.256000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.5.74:22-4.153.228.146:53528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:03.257243 systemd[1]: Started sshd@15-10.0.5.74:22-4.153.228.146:53528.service - OpenSSH per-connection server daemon (4.153.228.146:53528). 
Jan 21 01:01:03.787000 audit[5207]: USER_ACCT pid=5207 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.788071 sshd[5207]: Accepted publickey for core from 4.153.228.146 port 53528 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:03.788000 audit[5207]: CRED_ACQ pid=5207 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.788000 audit[5207]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffda5f08820 a2=3 a3=0 items=0 ppid=1 pid=5207 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:03.788000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:03.790666 sshd-session[5207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:03.796487 systemd-logind[1653]: New session 17 of user core. Jan 21 01:01:03.802867 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 21 01:01:03.806000 audit[5207]: USER_START pid=5207 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:03.808000 audit[5211]: CRED_ACQ pid=5211 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:04.462268 sshd[5211]: Connection closed by 4.153.228.146 port 53528 Jan 21 01:01:04.461532 sshd-session[5207]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:04.463000 audit[5207]: USER_END pid=5207 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:04.463000 audit[5207]: CRED_DISP pid=5207 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:04.466731 systemd[1]: sshd@15-10.0.5.74:22-4.153.228.146:53528.service: Deactivated successfully. Jan 21 01:01:04.466000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.5.74:22-4.153.228.146:53528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:04.469167 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 21 01:01:04.472270 systemd-logind[1653]: Session 17 logged out. Waiting for processes to exit. Jan 21 01:01:04.473424 systemd-logind[1653]: Removed session 17. Jan 21 01:01:04.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.5.74:22-4.153.228.146:48486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:04.569916 systemd[1]: Started sshd@16-10.0.5.74:22-4.153.228.146:48486.service - OpenSSH per-connection server daemon (4.153.228.146:48486). Jan 21 01:01:05.130000 audit[5221]: USER_ACCT pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:05.130987 sshd[5221]: Accepted publickey for core from 4.153.228.146 port 48486 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:05.131000 audit[5221]: CRED_ACQ pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:05.131000 audit[5221]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe569736a0 a2=3 a3=0 items=0 ppid=1 pid=5221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:05.131000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:05.132953 sshd-session[5221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:05.140726 systemd-logind[1653]: New session 18 of user core. 
Jan 21 01:01:05.143831 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 21 01:01:05.146000 audit[5221]: USER_START pid=5221 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:05.148000 audit[5225]: CRED_ACQ pid=5225 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:05.978000 audit[5235]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5235 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 01:01:05.978000 audit[5235]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffd70950b10 a2=0 a3=7ffd70950afc items=0 ppid=3056 pid=5235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:05.978000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 01:01:05.981000 audit[5235]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5235 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 01:01:05.981000 audit[5235]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd70950b10 a2=0 a3=0 items=0 ppid=3056 pid=5235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:05.981000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 01:01:06.090026 sshd[5225]: Connection closed by 4.153.228.146 port 48486 Jan 21 01:01:06.090757 sshd-session[5221]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:06.092000 audit[5221]: USER_END pid=5221 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:06.093000 audit[5237]: NETFILTER_CFG table=filter:146 family=2 entries=38 op=nft_register_rule pid=5237 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 01:01:06.093000 audit[5237]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffeaa8a9b10 a2=0 a3=7ffeaa8a9afc items=0 ppid=3056 pid=5237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:06.093000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 01:01:06.094000 audit[5221]: CRED_DISP pid=5221 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:06.097509 systemd[1]: sshd@16-10.0.5.74:22-4.153.228.146:48486.service: Deactivated successfully. Jan 21 01:01:06.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.5.74:22-4.153.228.146:48486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 21 01:01:06.099639 systemd[1]: session-18.scope: Deactivated successfully. Jan 21 01:01:06.099000 audit[5237]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5237 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 21 01:01:06.101894 systemd-logind[1653]: Session 18 logged out. Waiting for processes to exit. Jan 21 01:01:06.099000 audit[5237]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffeaa8a9b10 a2=0 a3=0 items=0 ppid=3056 pid=5237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:06.099000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 21 01:01:06.104945 systemd-logind[1653]: Removed session 18. Jan 21 01:01:06.202963 systemd[1]: Started sshd@17-10.0.5.74:22-4.153.228.146:48500.service - OpenSSH per-connection server daemon (4.153.228.146:48500). Jan 21 01:01:06.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.5.74:22-4.153.228.146:48500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:01:06.743000 audit[5242]: USER_ACCT pid=5242 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:06.746966 sshd[5242]: Accepted publickey for core from 4.153.228.146 port 48500 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:06.747000 audit[5242]: CRED_ACQ pid=5242 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:06.748000 audit[5242]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd3edb6120 a2=3 a3=0 items=0 ppid=1 pid=5242 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:06.748000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:06.749520 sshd-session[5242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:06.755040 systemd-logind[1653]: New session 19 of user core. Jan 21 01:01:06.759903 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 21 01:01:06.762000 audit[5242]: USER_START pid=5242 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:06.765000 audit[5246]: CRED_ACQ pid=5246 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.249822 sshd[5246]: Connection closed by 4.153.228.146 port 48500 Jan 21 01:01:07.253535 sshd-session[5242]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:07.260818 kernel: kauditd_printk_skb: 43 callbacks suppressed Jan 21 01:01:07.260924 kernel: audit: type=1106 audit(1768957267.253:829): pid=5242 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.253000 audit[5242]: USER_END pid=5242 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.264406 systemd-logind[1653]: Session 19 logged out. Waiting for processes to exit. Jan 21 01:01:07.264770 systemd[1]: sshd@17-10.0.5.74:22-4.153.228.146:48500.service: Deactivated successfully. Jan 21 01:01:07.267692 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 21 01:01:07.271814 kernel: audit: type=1104 audit(1768957267.260:830): pid=5242 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.260000 audit[5242]: CRED_DISP pid=5242 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.264000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.5.74:22-4.153.228.146:48500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:07.275626 systemd-logind[1653]: Removed session 19. Jan 21 01:01:07.276886 kernel: audit: type=1131 audit(1768957267.264:831): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.5.74:22-4.153.228.146:48500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:07.361259 systemd[1]: Started sshd@18-10.0.5.74:22-4.153.228.146:48512.service - OpenSSH per-connection server daemon (4.153.228.146:48512). Jan 21 01:01:07.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.5.74:22-4.153.228.146:48512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:07.365729 kernel: audit: type=1130 audit(1768957267.360:832): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.5.74:22-4.153.228.146:48512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:01:07.913000 audit[5256]: USER_ACCT pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.915882 sshd[5256]: Accepted publickey for core from 4.153.228.146 port 48512 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:07.918125 sshd-session[5256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:07.915000 audit[5256]: CRED_ACQ pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.920477 kernel: audit: type=1101 audit(1768957267.913:833): pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.920539 kernel: audit: type=1103 audit(1768957267.915:834): pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.924493 kernel: audit: type=1006 audit(1768957267.915:835): pid=5256 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 21 01:01:07.915000 audit[5256]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd090fd360 a2=3 a3=0 items=0 ppid=1 pid=5256 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:07.928205 kernel: audit: type=1300 audit(1768957267.915:835): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd090fd360 a2=3 a3=0 items=0 ppid=1 pid=5256 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:07.928762 systemd-logind[1653]: New session 20 of user core. Jan 21 01:01:07.915000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:07.932454 kernel: audit: type=1327 audit(1768957267.915:835): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:07.934875 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 21 01:01:07.937000 audit[5256]: USER_START pid=5256 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.943710 kernel: audit: type=1105 audit(1768957267.937:836): pid=5256 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:07.943000 audit[5260]: CRED_ACQ pid=5260 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:08.210915 kubelet[2901]: E0121 01:01:08.210885 2901 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:01:08.315378 sshd[5260]: Connection closed by 4.153.228.146 port 48512 Jan 21 01:01:08.316626 sshd-session[5256]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:08.317000 audit[5256]: USER_END pid=5256 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:08.317000 audit[5256]: CRED_DISP pid=5256 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:08.320303 systemd-logind[1653]: Session 20 logged out. Waiting for processes to exit. Jan 21 01:01:08.322981 systemd[1]: sshd@18-10.0.5.74:22-4.153.228.146:48512.service: Deactivated successfully. Jan 21 01:01:08.322000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.5.74:22-4.153.228.146:48512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:08.324838 systemd[1]: session-20.scope: Deactivated successfully. Jan 21 01:01:08.326572 systemd-logind[1653]: Removed session 20. 
Jan 21 01:01:09.212111 kubelet[2901]: E0121 01:01:09.212044 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c"
Jan 21 01:01:10.412000 audit[5271]: NETFILTER_CFG table=filter:148 family=2 entries=26 op=nft_register_rule pid=5271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 21 01:01:10.412000 audit[5271]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffff9599330 a2=0 a3=7ffff959931c items=0 ppid=3056 pid=5271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:10.412000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 21 01:01:10.417000 audit[5271]: NETFILTER_CFG table=nat:149 family=2 entries=104 op=nft_register_chain pid=5271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 21 01:01:10.417000 audit[5271]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffff9599330 a2=0 a3=7ffff959931c items=0 ppid=3056 pid=5271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:10.417000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 21 01:01:11.214426 kubelet[2901]: E0121 01:01:11.214298 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63"
Jan 21 01:01:13.213583 kubelet[2901]: E0121 01:01:13.213335 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7"
Jan 21 01:01:13.214393 kubelet[2901]: E0121 01:01:13.214295 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c"
Jan 21 01:01:13.214393 kubelet[2901]: E0121 01:01:13.214360 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77"
Jan 21 01:01:13.429200 systemd[1]: Started sshd@19-10.0.5.74:22-4.153.228.146:48520.service - OpenSSH per-connection server daemon (4.153.228.146:48520).
Jan 21 01:01:13.432418 kernel: kauditd_printk_skb: 10 callbacks suppressed
Jan 21 01:01:13.432475 kernel: audit: type=1130 audit(1768957273.428:843): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.5.74:22-4.153.228.146:48520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:13.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.5.74:22-4.153.228.146:48520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:13.976000 audit[5273]: USER_ACCT pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:13.983706 kernel: audit: type=1101 audit(1768957273.976:844): pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:13.983824 sshd[5273]: Accepted publickey for core from 4.153.228.146 port 48520 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go
Jan 21 01:01:13.982000 audit[5273]: CRED_ACQ pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:13.985432 sshd-session[5273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 21 01:01:13.988762 kernel: audit: type=1103 audit(1768957273.982:845): pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:13.988993 kernel: audit: type=1006 audit(1768957273.983:846): pid=5273 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1
Jan 21 01:01:13.983000 audit[5273]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe28fe24f0 a2=3 a3=0 items=0 ppid=1 pid=5273 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:13.996697 kernel: audit: type=1300 audit(1768957273.983:846): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe28fe24f0 a2=3 a3=0 items=0 ppid=1 pid=5273 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:13.983000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:13.999694 kernel: audit: type=1327 audit(1768957273.983:846): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:14.001347 systemd-logind[1653]: New session 21 of user core.
Jan 21 01:01:14.006887 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 21 01:01:14.009000 audit[5273]: USER_START pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.016819 kernel: audit: type=1105 audit(1768957274.009:847): pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.015000 audit[5277]: CRED_ACQ pid=5277 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.020703 kernel: audit: type=1103 audit(1768957274.015:848): pid=5277 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.357039 sshd[5277]: Connection closed by 4.153.228.146 port 48520
Jan 21 01:01:14.356916 sshd-session[5273]: pam_unix(sshd:session): session closed for user core
Jan 21 01:01:14.357000 audit[5273]: USER_END pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.361217 systemd-logind[1653]: Session 21 logged out. Waiting for processes to exit.
Jan 21 01:01:14.362644 systemd[1]: sshd@19-10.0.5.74:22-4.153.228.146:48520.service: Deactivated successfully.
Jan 21 01:01:14.364919 systemd[1]: session-21.scope: Deactivated successfully.
Jan 21 01:01:14.357000 audit[5273]: CRED_DISP pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.367274 kernel: audit: type=1106 audit(1768957274.357:849): pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.367336 kernel: audit: type=1104 audit(1768957274.357:850): pid=5273 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:14.367525 systemd-logind[1653]: Removed session 21.
Jan 21 01:01:14.361000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.5.74:22-4.153.228.146:48520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:19.461000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.5.74:22-4.153.228.146:44466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:19.463249 systemd[1]: Started sshd@20-10.0.5.74:22-4.153.228.146:44466.service - OpenSSH per-connection server daemon (4.153.228.146:44466).
Jan 21 01:01:19.465184 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 21 01:01:19.465220 kernel: audit: type=1130 audit(1768957279.461:852): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.5.74:22-4.153.228.146:44466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:19.984000 audit[5288]: USER_ACCT pid=5288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:19.991296 sshd[5288]: Accepted publickey for core from 4.153.228.146 port 44466 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go
Jan 21 01:01:19.991703 kernel: audit: type=1101 audit(1768957279.984:853): pid=5288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:19.995765 kernel: audit: type=1103 audit(1768957279.990:854): pid=5288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:19.990000 audit[5288]: CRED_ACQ pid=5288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:19.996646 sshd-session[5288]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 21 01:01:20.000921 kernel: audit: type=1006 audit(1768957279.994:855): pid=5288 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1
Jan 21 01:01:19.994000 audit[5288]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd72890f90 a2=3 a3=0 items=0 ppid=1 pid=5288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:20.006373 kernel: audit: type=1300 audit(1768957279.994:855): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd72890f90 a2=3 a3=0 items=0 ppid=1 pid=5288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:20.006447 kernel: audit: type=1327 audit(1768957279.994:855): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:19.994000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:20.007279 systemd-logind[1653]: New session 22 of user core.
Jan 21 01:01:20.012884 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 21 01:01:20.014000 audit[5288]: USER_START pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.022197 kernel: audit: type=1105 audit(1768957280.014:856): pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.023000 audit[5292]: CRED_ACQ pid=5292 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.029717 kernel: audit: type=1103 audit(1768957280.023:857): pid=5292 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.212890 kubelet[2901]: E0121 01:01:20.212595 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722"
Jan 21 01:01:20.354873 sshd[5292]: Connection closed by 4.153.228.146 port 44466
Jan 21 01:01:20.355828 sshd-session[5288]: pam_unix(sshd:session): session closed for user core
Jan 21 01:01:20.356000 audit[5288]: USER_END pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.362530 systemd[1]: sshd@20-10.0.5.74:22-4.153.228.146:44466.service: Deactivated successfully.
Jan 21 01:01:20.363724 kernel: audit: type=1106 audit(1768957280.356:858): pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.356000 audit[5288]: CRED_DISP pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.368618 systemd[1]: session-22.scope: Deactivated successfully.
Jan 21 01:01:20.369766 kernel: audit: type=1104 audit(1768957280.356:859): pid=5288 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:20.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.5.74:22-4.153.228.146:44466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:20.371694 systemd-logind[1653]: Session 22 logged out. Waiting for processes to exit.
Jan 21 01:01:20.374211 systemd-logind[1653]: Removed session 22.
Jan 21 01:01:23.211785 kubelet[2901]: E0121 01:01:23.211728 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c"
Jan 21 01:01:24.212368 kubelet[2901]: E0121 01:01:24.212311 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7"
Jan 21 01:01:24.212996 kubelet[2901]: E0121 01:01:24.212832 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77"
Jan 21 01:01:25.467715 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 21 01:01:25.467814 kernel: audit: type=1130 audit(1768957285.461:861): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.5.74:22-4.153.228.146:57558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:25.461000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.5.74:22-4.153.228.146:57558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:25.463277 systemd[1]: Started sshd@21-10.0.5.74:22-4.153.228.146:57558.service - OpenSSH per-connection server daemon (4.153.228.146:57558).
Jan 21 01:01:25.976000 audit[5306]: USER_ACCT pid=5306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:25.983537 sshd[5306]: Accepted publickey for core from 4.153.228.146 port 57558 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go
Jan 21 01:01:25.983820 kernel: audit: type=1101 audit(1768957285.976:862): pid=5306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:25.983000 audit[5306]: CRED_ACQ pid=5306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:25.989079 sshd-session[5306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 21 01:01:25.989847 kernel: audit: type=1103 audit(1768957285.983:863): pid=5306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:25.992856 kernel: audit: type=1006 audit(1768957285.983:864): pid=5306 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1
Jan 21 01:01:25.983000 audit[5306]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed6173ec0 a2=3 a3=0 items=0 ppid=1 pid=5306 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:25.997778 kernel: audit: type=1300 audit(1768957285.983:864): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffed6173ec0 a2=3 a3=0 items=0 ppid=1 pid=5306 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:25.983000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:25.999735 kernel: audit: type=1327 audit(1768957285.983:864): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:26.004737 systemd-logind[1653]: New session 23 of user core.
Jan 21 01:01:26.011001 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 21 01:01:26.019776 kernel: audit: type=1105 audit(1768957286.012:865): pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.012000 audit[5306]: USER_START pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.018000 audit[5310]: CRED_ACQ pid=5310 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.023733 kernel: audit: type=1103 audit(1768957286.018:866): pid=5310 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.212309 kubelet[2901]: E0121 01:01:26.212108 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63"
Jan 21 01:01:26.354628 sshd[5310]: Connection closed by 4.153.228.146 port 57558
Jan 21 01:01:26.354344 sshd-session[5306]: pam_unix(sshd:session): session closed for user core
Jan 21 01:01:26.354000 audit[5306]: USER_END pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.362824 kernel: audit: type=1106 audit(1768957286.354:867): pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.363547 systemd[1]: sshd@21-10.0.5.74:22-4.153.228.146:57558.service: Deactivated successfully.
Jan 21 01:01:26.366253 systemd[1]: session-23.scope: Deactivated successfully.
Jan 21 01:01:26.369344 systemd-logind[1653]: Session 23 logged out. Waiting for processes to exit.
Jan 21 01:01:26.354000 audit[5306]: CRED_DISP pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.373703 kernel: audit: type=1104 audit(1768957286.354:868): pid=5306 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:26.374409 systemd-logind[1653]: Removed session 23.
Jan 21 01:01:26.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.5.74:22-4.153.228.146:57558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:28.211997 kubelet[2901]: E0121 01:01:28.211877 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c"
Jan 21 01:01:31.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.5.74:22-4.153.228.146:57570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:31.456485 systemd[1]: Started sshd@22-10.0.5.74:22-4.153.228.146:57570.service - OpenSSH per-connection server daemon (4.153.228.146:57570).
Jan 21 01:01:31.457502 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 21 01:01:31.457540 kernel: audit: type=1130 audit(1768957291.455:870): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.5.74:22-4.153.228.146:57570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 21 01:01:31.972000 audit[5322]: USER_ACCT pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:31.973563 sshd[5322]: Accepted publickey for core from 4.153.228.146 port 57570 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go
Jan 21 01:01:31.976594 sshd-session[5322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 21 01:01:31.978849 kernel: audit: type=1101 audit(1768957291.972:871): pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:31.978920 kernel: audit: type=1103 audit(1768957291.975:872): pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:31.975000 audit[5322]: CRED_ACQ pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:31.983872 kernel: audit: type=1006 audit(1768957291.975:873): pid=5322 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1
Jan 21 01:01:31.975000 audit[5322]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec22caf40 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:31.988031 kernel: audit: type=1300 audit(1768957291.975:873): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec22caf40 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:01:31.987193 systemd-logind[1653]: New session 24 of user core.
Jan 21 01:01:31.975000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:31.993335 kernel: audit: type=1327 audit(1768957291.975:873): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 21 01:01:31.994216 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 21 01:01:31.997000 audit[5322]: USER_START pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:31.999000 audit[5328]: CRED_ACQ pid=5328 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.004970 kernel: audit: type=1105 audit(1768957291.997:874): pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.005018 kernel: audit: type=1103 audit(1768957291.999:875): pid=5328 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.211319 kubelet[2901]: E0121 01:01:32.211284 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722"
Jan 21 01:01:32.338461 sshd[5328]: Connection closed by 4.153.228.146 port 57570
Jan 21 01:01:32.342247 sshd-session[5322]: pam_unix(sshd:session): session closed for user core
Jan 21 01:01:32.343000 audit[5322]: USER_END pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.348397 systemd[1]: sshd@22-10.0.5.74:22-4.153.228.146:57570.service: Deactivated successfully.
Jan 21 01:01:32.349707 kernel: audit: type=1106 audit(1768957292.343:876): pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.343000 audit[5322]: CRED_DISP pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 21 01:01:32.351904 systemd[1]: session-24.scope: Deactivated successfully.
Jan 21 01:01:32.353374 systemd-logind[1653]: Session 24 logged out. Waiting for processes to exit.
Jan 21 01:01:32.354753 kernel: audit: type=1104 audit(1768957292.343:877): pid=5322 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:32.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.5.74:22-4.153.228.146:57570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:32.359666 systemd-logind[1653]: Removed session 24. Jan 21 01:01:37.214589 kubelet[2901]: E0121 01:01:37.214550 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:01:37.446753 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 21 01:01:37.446832 kernel: audit: type=1130 audit(1768957297.445:879): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.5.74:22-4.153.228.146:57272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:37.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.5.74:22-4.153.228.146:57272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 21 01:01:37.445901 systemd[1]: Started sshd@23-10.0.5.74:22-4.153.228.146:57272.service - OpenSSH per-connection server daemon (4.153.228.146:57272). Jan 21 01:01:37.962000 audit[5364]: USER_ACCT pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.964662 sshd[5364]: Accepted publickey for core from 4.153.228.146 port 57272 ssh2: RSA SHA256:31xSJzLknPW0WXQ1/0HnwH4E7nhabzVasrxSAbB59go Jan 21 01:01:37.966349 sshd-session[5364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 21 01:01:37.964000 audit[5364]: CRED_ACQ pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.969717 kernel: audit: type=1101 audit(1768957297.962:880): pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.969768 kernel: audit: type=1103 audit(1768957297.964:881): pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.973274 kernel: audit: type=1006 audit(1768957297.964:882): pid=5364 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 21 01:01:37.974715 kernel: audit: type=1300 audit(1768957297.964:882): arch=c000003e 
syscall=1 success=yes exit=3 a0=8 a1=7ffd2e75b760 a2=3 a3=0 items=0 ppid=1 pid=5364 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:37.964000 audit[5364]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd2e75b760 a2=3 a3=0 items=0 ppid=1 pid=5364 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:01:37.976167 systemd-logind[1653]: New session 25 of user core. Jan 21 01:01:37.964000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:37.980488 kernel: audit: type=1327 audit(1768957297.964:882): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 21 01:01:37.980996 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 21 01:01:37.983000 audit[5364]: USER_START pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.989728 kernel: audit: type=1105 audit(1768957297.983:883): pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.989799 kernel: audit: type=1103 audit(1768957297.985:884): pid=5368 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:37.985000 audit[5368]: CRED_ACQ pid=5368 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:38.211003 kubelet[2901]: E0121 01:01:38.210944 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:01:38.315087 sshd[5368]: Connection closed by 4.153.228.146 port 57272 Jan 21 01:01:38.315799 sshd-session[5364]: pam_unix(sshd:session): session closed for user core Jan 21 01:01:38.316000 audit[5364]: USER_END pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:38.322771 kernel: audit: type=1106 audit(1768957298.316:885): pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:38.322720 systemd[1]: 
sshd@23-10.0.5.74:22-4.153.228.146:57272.service: Deactivated successfully. Jan 21 01:01:38.316000 audit[5364]: CRED_DISP pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:38.324305 systemd[1]: session-25.scope: Deactivated successfully. Jan 21 01:01:38.327606 systemd-logind[1653]: Session 25 logged out. Waiting for processes to exit. Jan 21 01:01:38.328210 kernel: audit: type=1104 audit(1768957298.316:886): pid=5364 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 21 01:01:38.322000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.5.74:22-4.153.228.146:57272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 21 01:01:38.329179 systemd-logind[1653]: Removed session 25. 
Jan 21 01:01:39.214295 kubelet[2901]: E0121 01:01:39.214260 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:01:41.216203 kubelet[2901]: E0121 01:01:41.215907 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:01:41.217144 kubelet[2901]: E0121 01:01:41.216797 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:01:46.211153 kubelet[2901]: E0121 01:01:46.211093 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:01:50.211632 kubelet[2901]: E0121 01:01:50.211553 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:01:51.213748 kubelet[2901]: E0121 01:01:51.212526 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:01:54.212165 kubelet[2901]: E0121 01:01:54.212121 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:01:54.213241 containerd[1680]: time="2026-01-21T01:01:54.213132176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 21 01:01:54.214196 kubelet[2901]: E0121 01:01:54.214165 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77" Jan 21 01:01:54.557759 containerd[1680]: time="2026-01-21T01:01:54.557419173Z" 
level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:01:54.560710 containerd[1680]: time="2026-01-21T01:01:54.559200985Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 21 01:01:54.560905 containerd[1680]: time="2026-01-21T01:01:54.559259880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 21 01:01:54.561426 kubelet[2901]: E0121 01:01:54.561084 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 01:01:54.561426 kubelet[2901]: E0121 01:01:54.561128 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 21 01:01:54.561426 kubelet[2901]: E0121 01:01:54.561227 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8f6a270e2c9b4b408eb629752f61ddd6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 21 01:01:54.563303 containerd[1680]: time="2026-01-21T01:01:54.563140212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 21 01:01:54.909379 containerd[1680]: 
time="2026-01-21T01:01:54.909281979Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:01:54.911420 containerd[1680]: time="2026-01-21T01:01:54.911326106Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 21 01:01:54.911420 containerd[1680]: time="2026-01-21T01:01:54.911388287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 21 01:01:54.911566 kubelet[2901]: E0121 01:01:54.911524 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 01:01:54.911623 kubelet[2901]: E0121 01:01:54.911568 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 21 01:01:54.913758 kubelet[2901]: E0121 01:01:54.911670 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64874dbd99-cd26r_calico-system(9f88865d-a485-4a8a-b0f8-118c9593a73c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 21 01:01:54.915153 kubelet[2901]: E0121 01:01:54.915105 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c" Jan 21 01:01:56.999947 update_engine[1654]: I20260121 01:01:56.999542 1654 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 21 01:01:56.999947 update_engine[1654]: I20260121 01:01:56.999604 1654 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 21 01:01:57.001769 update_engine[1654]: I20260121 01:01:57.001205 1654 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 21 01:01:57.001769 update_engine[1654]: I20260121 01:01:57.001565 1654 omaha_request_params.cc:62] Current group set to alpha Jan 21 01:01:57.003110 update_engine[1654]: I20260121 01:01:57.003049 1654 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 21 01:01:57.003110 update_engine[1654]: I20260121 01:01:57.003090 1654 update_attempter.cc:643] Scheduling an action processor start. 
Jan 21 01:01:57.003250 update_engine[1654]: I20260121 01:01:57.003118 1654 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 21 01:01:57.013151 update_engine[1654]: I20260121 01:01:57.012537 1654 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 21 01:01:57.013151 update_engine[1654]: I20260121 01:01:57.012646 1654 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 21 01:01:57.013151 update_engine[1654]: I20260121 01:01:57.012655 1654 omaha_request_action.cc:272] Request: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: Jan 21 01:01:57.013151 update_engine[1654]: I20260121 01:01:57.012660 1654 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 21 01:01:57.014319 locksmithd[1710]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 21 01:01:57.016660 update_engine[1654]: I20260121 01:01:57.016618 1654 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 21 01:01:57.017981 update_engine[1654]: I20260121 01:01:57.017951 1654 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 21 01:01:57.024573 update_engine[1654]: E20260121 01:01:57.024525 1654 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 21 01:01:57.024702 update_engine[1654]: I20260121 01:01:57.024614 1654 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 21 01:01:59.213287 containerd[1680]: time="2026-01-21T01:01:59.212749529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 01:01:59.544992 containerd[1680]: time="2026-01-21T01:01:59.544873855Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:01:59.546860 containerd[1680]: time="2026-01-21T01:01:59.546747089Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 01:01:59.546860 containerd[1680]: time="2026-01-21T01:01:59.546837259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 01:01:59.547356 kubelet[2901]: E0121 01:01:59.547108 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:01:59.547356 kubelet[2901]: E0121 01:01:59.547149 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:01:59.547356 kubelet[2901]: E0121 01:01:59.547285 2901 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hxfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-89fcv_calico-apiserver(557eba89-d604-4304-afae-f0e623ef8722): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 01:01:59.548490 kubelet[2901]: E0121 01:01:59.548452 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722" Jan 21 01:02:02.211781 containerd[1680]: time="2026-01-21T01:02:02.211727809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 21 01:02:02.545462 containerd[1680]: time="2026-01-21T01:02:02.545256905Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 
01:02:02.547081 containerd[1680]: time="2026-01-21T01:02:02.546982351Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 21 01:02:02.547081 containerd[1680]: time="2026-01-21T01:02:02.547001773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 21 01:02:02.547225 kubelet[2901]: E0121 01:02:02.547178 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:02:02.547536 kubelet[2901]: E0121 01:02:02.547224 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 21 01:02:02.547536 kubelet[2901]: E0121 01:02:02.547334 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dbdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-54cbb8896b-hxx9z_calico-apiserver(22543a80-9d55-4110-b0da-aa35bd7688e7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 21 01:02:02.548511 kubelet[2901]: E0121 01:02:02.548478 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7" Jan 21 01:02:03.212143 kubelet[2901]: E0121 01:02:03.212092 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c" Jan 21 01:02:05.212406 containerd[1680]: time="2026-01-21T01:02:05.212154828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 21 01:02:05.554249 containerd[1680]: time="2026-01-21T01:02:05.554037478Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 21 01:02:05.555763 containerd[1680]: time="2026-01-21T01:02:05.555657272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 21 01:02:05.555763 containerd[1680]: 
time="2026-01-21T01:02:05.555710161Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 21 01:02:05.556030 kubelet[2901]: E0121 01:02:05.555994 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 01:02:05.556525 kubelet[2901]: E0121 01:02:05.556319 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 21 01:02:05.556525 kubelet[2901]: E0121 01:02:05.556473 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:tru
e,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-95c8v_calico-system(be64252f-80c4-46f6-a3d4-52a6471b1a63): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 21 01:02:05.557690 kubelet[2901]: E0121 01:02:05.557647 2901 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63" Jan 21 01:02:06.994268 update_engine[1654]: I20260121 01:02:06.994134 1654 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 21 01:02:06.994268 update_engine[1654]: I20260121 01:02:06.994287 1654 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 21 01:02:06.994850 update_engine[1654]: I20260121 01:02:06.994823 1654 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 21 01:02:07.002266 update_engine[1654]: E20260121 01:02:07.002179 1654 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 21 01:02:07.002437 update_engine[1654]: I20260121 01:02:07.002306 1654 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 21 01:02:07.694340 kubelet[2901]: E0121 01:02:07.694228 2901 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.5.74:42584->10.0.5.62:2379: read: connection timed out" Jan 21 01:02:07.701613 systemd[1]: cri-containerd-7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960.scope: Deactivated successfully. Jan 21 01:02:07.701940 systemd[1]: cri-containerd-7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960.scope: Consumed 2.107s CPU time, 22.5M memory peak, 148K read from disk. 
Jan 21 01:02:07.705745 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 21 01:02:07.705839 kernel: audit: type=1334 audit(1768957327.700:888): prog-id=256 op=LOAD Jan 21 01:02:07.700000 audit: BPF prog-id=256 op=LOAD Jan 21 01:02:07.705921 containerd[1680]: time="2026-01-21T01:02:07.705203553Z" level=info msg="received container exit event container_id:\"7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960\" id:\"7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960\" pid:2769 exit_status:1 exited_at:{seconds:1768957327 nanos:704471941}" Jan 21 01:02:07.700000 audit: BPF prog-id=93 op=UNLOAD Jan 21 01:02:07.711201 kernel: audit: type=1334 audit(1768957327.700:889): prog-id=93 op=UNLOAD Jan 21 01:02:07.711273 kernel: audit: type=1334 audit(1768957327.705:890): prog-id=108 op=UNLOAD Jan 21 01:02:07.705000 audit: BPF prog-id=108 op=UNLOAD Jan 21 01:02:07.712949 kernel: audit: type=1334 audit(1768957327.705:891): prog-id=112 op=UNLOAD Jan 21 01:02:07.705000 audit: BPF prog-id=112 op=UNLOAD Jan 21 01:02:07.729812 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960-rootfs.mount: Deactivated successfully. Jan 21 01:02:07.921002 systemd[1]: cri-containerd-df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102.scope: Deactivated successfully. Jan 21 01:02:07.921272 systemd[1]: cri-containerd-df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102.scope: Consumed 3.054s CPU time, 58.9M memory peak, 192K read from disk. 
Jan 21 01:02:07.920000 audit: BPF prog-id=257 op=LOAD Jan 21 01:02:07.923758 kernel: audit: type=1334 audit(1768957327.920:892): prog-id=257 op=LOAD Jan 21 01:02:07.920000 audit: BPF prog-id=88 op=UNLOAD Jan 21 01:02:07.925695 kernel: audit: type=1334 audit(1768957327.920:893): prog-id=88 op=UNLOAD Jan 21 01:02:07.923000 audit: BPF prog-id=103 op=UNLOAD Jan 21 01:02:07.927826 containerd[1680]: time="2026-01-21T01:02:07.926813725Z" level=info msg="received container exit event container_id:\"df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102\" id:\"df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102\" pid:2749 exit_status:1 exited_at:{seconds:1768957327 nanos:925329140}" Jan 21 01:02:07.923000 audit: BPF prog-id=107 op=UNLOAD Jan 21 01:02:07.929303 kernel: audit: type=1334 audit(1768957327.923:894): prog-id=103 op=UNLOAD Jan 21 01:02:07.929349 kernel: audit: type=1334 audit(1768957327.923:895): prog-id=107 op=UNLOAD Jan 21 01:02:07.950611 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102-rootfs.mount: Deactivated successfully. Jan 21 01:02:08.258767 systemd[1]: cri-containerd-0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca.scope: Deactivated successfully. Jan 21 01:02:08.259778 systemd[1]: cri-containerd-0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca.scope: Consumed 23.182s CPU time, 105.5M memory peak. 
Jan 21 01:02:08.262979 containerd[1680]: time="2026-01-21T01:02:08.262945811Z" level=info msg="received container exit event container_id:\"0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca\" id:\"0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca\" pid:3224 exit_status:1 exited_at:{seconds:1768957328 nanos:259982525}" Jan 21 01:02:08.268073 kernel: audit: type=1334 audit(1768957328.263:896): prog-id=146 op=UNLOAD Jan 21 01:02:08.268171 kernel: audit: type=1334 audit(1768957328.263:897): prog-id=150 op=UNLOAD Jan 21 01:02:08.263000 audit: BPF prog-id=146 op=UNLOAD Jan 21 01:02:08.263000 audit: BPF prog-id=150 op=UNLOAD Jan 21 01:02:08.287743 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca-rootfs.mount: Deactivated successfully. Jan 21 01:02:08.694902 kubelet[2901]: I0121 01:02:08.694343 2901 scope.go:117] "RemoveContainer" containerID="0ef9527e05f57cc37c51ccd4c63332f20a46a6f96b7a4fc1d2f35ca91a6882ca" Jan 21 01:02:08.697028 kubelet[2901]: I0121 01:02:08.696883 2901 scope.go:117] "RemoveContainer" containerID="df42269d3da3a55b5b1f41fd48f6fee96c71386faf9ebbf8962d89b92782f102" Jan 21 01:02:08.697211 containerd[1680]: time="2026-01-21T01:02:08.697176846Z" level=info msg="CreateContainer within sandbox \"450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 21 01:02:08.700404 kubelet[2901]: I0121 01:02:08.700380 2901 scope.go:117] "RemoveContainer" containerID="7eb23e2cf70d6a151452bd2ffd43b60b4d92ffe3bac811ce411db8da0cebd960" Jan 21 01:02:08.700714 containerd[1680]: time="2026-01-21T01:02:08.700644350Z" level=info msg="CreateContainer within sandbox \"8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 21 01:02:08.703676 containerd[1680]: time="2026-01-21T01:02:08.703651060Z" level=info 
msg="CreateContainer within sandbox \"2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 21 01:02:08.718638 containerd[1680]: time="2026-01-21T01:02:08.718602628Z" level=info msg="Container 9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d: CDI devices from CRI Config.CDIDevices: []" Jan 21 01:02:08.729334 containerd[1680]: time="2026-01-21T01:02:08.729077682Z" level=info msg="Container c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9: CDI devices from CRI Config.CDIDevices: []" Jan 21 01:02:08.738622 containerd[1680]: time="2026-01-21T01:02:08.738579804Z" level=info msg="CreateContainer within sandbox \"8981a065df3ff0c296c8061cf4d99c5946d106fb652bb66b69e710d26864c08f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d\"" Jan 21 01:02:08.743707 containerd[1680]: time="2026-01-21T01:02:08.739373303Z" level=info msg="StartContainer for \"9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d\"" Jan 21 01:02:08.748263 containerd[1680]: time="2026-01-21T01:02:08.745552814Z" level=info msg="connecting to shim 9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d" address="unix:///run/containerd/s/8d8d74b139111924092190e0a48d8853d1c95fbc706f7151d9830eb10c18ac48" protocol=ttrpc version=3 Jan 21 01:02:08.748537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1527319978.mount: Deactivated successfully. 
Jan 21 01:02:08.750705 containerd[1680]: time="2026-01-21T01:02:08.750537877Z" level=info msg="CreateContainer within sandbox \"450c34364066109978fe09fa992ad6c2c4ecf711d9656e7efca042de0425d296\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9\"" Jan 21 01:02:08.753707 containerd[1680]: time="2026-01-21T01:02:08.751975808Z" level=info msg="StartContainer for \"c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9\"" Jan 21 01:02:08.753707 containerd[1680]: time="2026-01-21T01:02:08.752818446Z" level=info msg="connecting to shim c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9" address="unix:///run/containerd/s/3f6435668d8f6faeb8dddb52e46ca1cadaaecc6349796259469e8bb9996dd34d" protocol=ttrpc version=3 Jan 21 01:02:08.755699 containerd[1680]: time="2026-01-21T01:02:08.754009219Z" level=info msg="Container 5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e: CDI devices from CRI Config.CDIDevices: []" Jan 21 01:02:08.775525 containerd[1680]: time="2026-01-21T01:02:08.774601168Z" level=info msg="CreateContainer within sandbox \"2e9492921b1836aa6f8752b0782eb00f351006e704223894420a9608a3501416\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e\"" Jan 21 01:02:08.776277 containerd[1680]: time="2026-01-21T01:02:08.776253702Z" level=info msg="StartContainer for \"5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e\"" Jan 21 01:02:08.776922 systemd[1]: Started cri-containerd-9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d.scope - libcontainer container 9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d. 
Jan 21 01:02:08.781532 containerd[1680]: time="2026-01-21T01:02:08.781015294Z" level=info msg="connecting to shim 5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e" address="unix:///run/containerd/s/f016eb9e19f88219bbe73ae3611c579f01419a2f69cf4ae1a48acb5c18763823" protocol=ttrpc version=3 Jan 21 01:02:08.786883 systemd[1]: Started cri-containerd-c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9.scope - libcontainer container c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9. Jan 21 01:02:08.794000 audit: BPF prog-id=258 op=LOAD Jan 21 01:02:08.795000 audit: BPF prog-id=259 op=LOAD Jan 21 01:02:08.795000 audit[5461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.795000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.795000 audit: BPF prog-id=259 op=UNLOAD Jan 21 01:02:08.795000 audit[5461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.795000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.796000 audit: BPF prog-id=260 op=LOAD Jan 21 
01:02:08.796000 audit[5461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.796000 audit: BPF prog-id=261 op=LOAD Jan 21 01:02:08.796000 audit[5461]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.796000 audit: BPF prog-id=261 op=UNLOAD Jan 21 01:02:08.796000 audit[5461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 
01:02:08.796000 audit: BPF prog-id=260 op=UNLOAD Jan 21 01:02:08.796000 audit[5461]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.796000 audit: BPF prog-id=262 op=LOAD Jan 21 01:02:08.796000 audit[5461]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2613 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965356161393435383634663963303635613330363532383161306335 Jan 21 01:02:08.805862 systemd[1]: Started cri-containerd-5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e.scope - libcontainer container 5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e. 
Jan 21 01:02:08.811000 audit: BPF prog-id=263 op=LOAD Jan 21 01:02:08.812000 audit: BPF prog-id=264 op=LOAD Jan 21 01:02:08.812000 audit[5462]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939 Jan 21 01:02:08.812000 audit: BPF prog-id=264 op=UNLOAD Jan 21 01:02:08.812000 audit[5462]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939 Jan 21 01:02:08.813000 audit: BPF prog-id=265 op=LOAD Jan 21 01:02:08.813000 audit[5462]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.813000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939 Jan 21 01:02:08.813000 audit: BPF prog-id=266 op=LOAD Jan 21 01:02:08.813000 audit[5462]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939 Jan 21 01:02:08.813000 audit: BPF prog-id=266 op=UNLOAD Jan 21 01:02:08.813000 audit[5462]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 01:02:08.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939 Jan 21 01:02:08.813000 audit: BPF prog-id=265 op=UNLOAD Jan 21 01:02:08.813000 audit[5462]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 21 
01:02:08.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939
Jan 21 01:02:08.814000 audit: BPF prog-id=267 op=LOAD
Jan 21 01:02:08.814000 audit[5462]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2954 pid=5462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335623035623731353766623638613339633538366633613162333939
Jan 21 01:02:08.825000 audit: BPF prog-id=268 op=LOAD
Jan 21 01:02:08.826000 audit: BPF prog-id=269 op=LOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=269 op=UNLOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=270 op=LOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=271 op=LOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=271 op=UNLOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=270 op=UNLOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.826000 audit: BPF prog-id=272 op=LOAD
Jan 21 01:02:08.826000 audit[5486]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2658 pid=5486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 21 01:02:08.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563636238343031383861613232623132663065316131636338313266
Jan 21 01:02:08.856873 containerd[1680]: time="2026-01-21T01:02:08.856755795Z" level=info msg="StartContainer for \"c5b05b7157fb68a39c586f3a1b399ab17852a73795b8758a4167beb2eb3561a9\" returns successfully"
Jan 21 01:02:08.868840 containerd[1680]: time="2026-01-21T01:02:08.868800791Z" level=info msg="StartContainer for \"9e5aa945864f9c065a3065281a0c5f60215f09441d0e1b6e8b21471b9895115d\" returns successfully"
Jan 21 01:02:08.885867 containerd[1680]: time="2026-01-21T01:02:08.885763012Z" level=info msg="StartContainer for \"5ccb840188aa22b12f0e1a1cc812f4a95a282c88807aad59b764430ef2fe765e\" returns successfully"
Jan 21 01:02:09.214836 kubelet[2901]: E0121 01:02:09.214784 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64874dbd99-cd26r" podUID="9f88865d-a485-4a8a-b0f8-118c9593a73c"
Jan 21 01:02:09.215197 containerd[1680]: time="2026-01-21T01:02:09.215015958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Jan 21 01:02:09.562247 containerd[1680]: time="2026-01-21T01:02:09.562035861Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 21 01:02:09.564080 containerd[1680]: time="2026-01-21T01:02:09.563972895Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Jan 21 01:02:09.564080 containerd[1680]: time="2026-01-21T01:02:09.564055158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0"
Jan 21 01:02:09.564239 kubelet[2901]: E0121 01:02:09.564174 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 21 01:02:09.564239 kubelet[2901]: E0121 01:02:09.564219 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 21 01:02:09.564372 kubelet[2901]: E0121 01:02:09.564324 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Jan 21 01:02:09.566204 containerd[1680]: time="2026-01-21T01:02:09.566028163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Jan 21 01:02:09.677765 kubelet[2901]: E0121 01:02:09.677635 2901 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.5.74:42366->10.0.5.62:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-54cbb8896b-89fcv.188c9927fc26d986 calico-apiserver 1289 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-54cbb8896b-89fcv,UID:557eba89-d604-4304-afae-f0e623ef8722,APIVersion:v1,ResourceVersion:777,FieldPath:spec.containers{calico-apiserver},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-af1f1f5a24,},FirstTimestamp:2026-01-21 00:59:10 +0000 UTC,LastTimestamp:2026-01-21 01:01:59.212006564 +0000 UTC m=+214.152622161,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-af1f1f5a24,}"
Jan 21 01:02:09.901088 containerd[1680]: time="2026-01-21T01:02:09.900907030Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 21 01:02:09.902662 containerd[1680]: time="2026-01-21T01:02:09.902575379Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Jan 21 01:02:09.903691 containerd[1680]: time="2026-01-21T01:02:09.902735759Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Jan 21 01:02:09.903917 kubelet[2901]: E0121 01:02:09.903859 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 21 01:02:09.903917 kubelet[2901]: E0121 01:02:09.903903 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 21 01:02:09.904361 kubelet[2901]: E0121 01:02:09.904321 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvxfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8lkb9_calico-system(851d5829-334a-4f46-97de-87be973a0b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Jan 21 01:02:09.905518 kubelet[2901]: E0121 01:02:09.905489 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8lkb9" podUID="851d5829-334a-4f46-97de-87be973a0b77"
Jan 21 01:02:10.211799 kubelet[2901]: E0121 01:02:10.211752 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-89fcv" podUID="557eba89-d604-4304-afae-f0e623ef8722"
Jan 21 01:02:10.766026 kubelet[2901]: I0121 01:02:10.765958 2901 status_manager.go:890] "Failed to get status for pod" podUID="a5089f61-d17c-42c3-bc87-ead8c34f0198" pod="tigera-operator/tigera-operator-7dcd859c48-fvzpt" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.5.74:42480->10.0.5.62:2379: read: connection timed out"
Jan 21 01:02:15.212922 containerd[1680]: time="2026-01-21T01:02:15.212642974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Jan 21 01:02:15.567775 containerd[1680]: time="2026-01-21T01:02:15.567630569Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 21 01:02:15.570528 containerd[1680]: time="2026-01-21T01:02:15.570435242Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Jan 21 01:02:15.570611 containerd[1680]: time="2026-01-21T01:02:15.570527502Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0"
Jan 21 01:02:15.570814 kubelet[2901]: E0121 01:02:15.570777 2901 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 21 01:02:15.571078 kubelet[2901]: E0121 01:02:15.570825 2901 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 21 01:02:15.571078 kubelet[2901]: E0121 01:02:15.570951 2901 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6z62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-bc7877fd9-wf2hg_calico-system(59ce7fac-1a6e-4ec4-b99e-063ed3e3444c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Jan 21 01:02:15.572416 kubelet[2901]: E0121 01:02:15.572376 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-bc7877fd9-wf2hg" podUID="59ce7fac-1a6e-4ec4-b99e-063ed3e3444c"
Jan 21 01:02:16.990641 update_engine[1654]: I20260121 01:02:16.990125 1654 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 21 01:02:16.990641 update_engine[1654]: I20260121 01:02:16.990222 1654 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 21 01:02:16.990641 update_engine[1654]: I20260121 01:02:16.990547 1654 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 21 01:02:16.996952 update_engine[1654]: E20260121 01:02:16.996909 1654 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Jan 21 01:02:16.997156 update_engine[1654]: I20260121 01:02:16.997133 1654 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Jan 21 01:02:17.211503 kubelet[2901]: E0121 01:02:17.211372 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-95c8v" podUID="be64252f-80c4-46f6-a3d4-52a6471b1a63"
Jan 21 01:02:17.696174 kubelet[2901]: E0121 01:02:17.695740 2901 controller.go:195] "Failed to update lease" err="Put \"https://10.0.5.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-af1f1f5a24?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 01:02:18.211294 kubelet[2901]: E0121 01:02:18.211227 2901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-54cbb8896b-hxx9z" podUID="22543a80-9d55-4110-b0da-aa35bd7688e7"