Jan 14 00:58:02.925056 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 13 22:15:29 -00 2026 Jan 14 00:58:02.925089 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=6d34ab71a3dc5a0ab37eb2c851228af18a1e24f648223df9a1099dbd7db2cfcf Jan 14 00:58:02.925099 kernel: BIOS-provided physical RAM map: Jan 14 00:58:02.925106 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 14 00:58:02.925112 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 14 00:58:02.925118 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 14 00:58:02.925128 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Jan 14 00:58:02.925134 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 14 00:58:02.925141 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 14 00:58:02.925147 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 14 00:58:02.925154 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable Jan 14 00:58:02.925160 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 14 00:58:02.925167 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 14 00:58:02.925173 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 14 00:58:02.925183 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 14 00:58:02.925190 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 14 00:58:02.925197 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 14 00:58:02.925204 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 14 00:58:02.925210 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 14 00:58:02.925217 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 14 00:58:02.925225 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 14 00:58:02.925232 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 14 00:58:02.925238 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 14 00:58:02.925245 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 14 00:58:02.925251 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 14 00:58:02.925258 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 14 00:58:02.925265 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 14 00:58:02.925271 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 14 00:58:02.925278 kernel: NX (Execute Disable) protection: active Jan 14 00:58:02.925284 kernel: APIC: Static calls initialized Jan 14 00:58:02.925291 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable Jan 14 00:58:02.925301 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable Jan 14 00:58:02.925308 kernel: extended physical RAM map: Jan 14 00:58:02.925314 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 14 
00:58:02.925321 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 14 00:58:02.925328 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 14 00:58:02.925335 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Jan 14 00:58:02.925341 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 14 00:58:02.925348 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 14 00:58:02.925355 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 14 00:58:02.925367 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable Jan 14 00:58:02.925374 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable Jan 14 00:58:02.925380 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable Jan 14 00:58:02.925387 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable Jan 14 00:58:02.925396 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable Jan 14 00:58:02.925402 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 14 00:58:02.925409 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 14 00:58:02.925416 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 14 00:58:02.925423 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 14 00:58:02.925430 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 14 00:58:02.925436 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 14 00:58:02.925443 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 14 00:58:02.925450 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 14 00:58:02.925457 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 14 00:58:02.925463 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 14 00:58:02.925473 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 14 00:58:02.925479 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 14 00:58:02.925486 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 14 00:58:02.925493 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 14 00:58:02.925500 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 14 00:58:02.925506 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 14 00:58:02.925513 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 14 00:58:02.925520 kernel: efi: EFI v2.7 by EDK II Jan 14 00:58:02.925527 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018 Jan 14 00:58:02.925534 kernel: random: crng init done Jan 14 00:58:02.925541 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 14 00:58:02.925549 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 14 00:58:02.925556 kernel: secureboot: Secure boot disabled Jan 14 00:58:02.925563 kernel: SMBIOS 2.8 present. 
Jan 14 00:58:02.925569 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Jan 14 00:58:02.925576 kernel: DMI: Memory slots populated: 1/1 Jan 14 00:58:02.925583 kernel: Hypervisor detected: KVM Jan 14 00:58:02.925590 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 14 00:58:02.925597 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 14 00:58:02.925604 kernel: kvm-clock: using sched offset of 5452117986 cycles Jan 14 00:58:02.925612 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 14 00:58:02.925621 kernel: tsc: Detected 2294.578 MHz processor Jan 14 00:58:02.925628 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 14 00:58:02.925636 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 14 00:58:02.925643 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000 Jan 14 00:58:02.925650 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 14 00:58:02.925658 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 14 00:58:02.925665 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 14 00:58:02.925672 kernel: Using GB pages for direct mapping Jan 14 00:58:02.925681 kernel: ACPI: Early table checksum verification disabled Jan 14 00:58:02.925689 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 14 00:58:02.925696 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Jan 14 00:58:02.925704 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:58:02.925711 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:58:02.925718 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 14 00:58:02.925725 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:58:02.925735 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:58:02.925742 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:58:02.925749 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 14 00:58:02.925756 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3] Jan 14 00:58:02.925764 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b] Jan 14 00:58:02.925771 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 14 00:58:02.925778 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f] Jan 14 00:58:02.925787 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b] Jan 14 00:58:02.925794 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027] Jan 14 00:58:02.925801 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037] Jan 14 00:58:02.925809 kernel: No NUMA configuration found Jan 14 00:58:02.925816 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff] Jan 14 00:58:02.925823 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff] Jan 14 00:58:02.925830 kernel: Zone ranges: Jan 14 00:58:02.925839 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 14 00:58:02.925847 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 14 00:58:02.925854 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff] Jan 14 00:58:02.925861 kernel: Device empty Jan 14 00:58:02.925868 kernel: Movable zone start for each node Jan 
14 00:58:02.925875 kernel: Early memory node ranges Jan 14 00:58:02.925883 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 14 00:58:02.925890 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Jan 14 00:58:02.925898 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Jan 14 00:58:02.925906 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Jan 14 00:58:02.925913 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff] Jan 14 00:58:02.925920 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff] Jan 14 00:58:02.925927 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff] Jan 14 00:58:02.925942 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff] Jan 14 00:58:02.925950 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff] Jan 14 00:58:02.925957 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff] Jan 14 00:58:02.925965 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff] Jan 14 00:58:02.925973 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 14 00:58:02.925982 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 14 00:58:02.925990 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Jan 14 00:58:02.925998 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 14 00:58:02.926005 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Jan 14 00:58:02.926015 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 14 00:58:02.926023 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Jan 14 00:58:02.926031 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 14 00:58:02.926039 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Jan 14 00:58:02.928070 kernel: On node 0, zone Normal: 276 pages in unavailable ranges Jan 14 00:58:02.928084 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 14 00:58:02.928092 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 14 00:58:02.928103 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 14 00:58:02.928110 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 14 00:58:02.928117 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 14 00:58:02.928124 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 14 00:58:02.928131 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 14 00:58:02.928138 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jan 14 00:58:02.928145 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 14 00:58:02.928152 kernel: TSC deadline timer available Jan 14 00:58:02.928161 kernel: CPU topo: Max. logical packages: 2 Jan 14 00:58:02.928168 kernel: CPU topo: Max. logical dies: 2 Jan 14 00:58:02.928175 kernel: CPU topo: Max. dies per package: 1 Jan 14 00:58:02.928182 kernel: CPU topo: Max. threads per core: 1 Jan 14 00:58:02.928189 kernel: CPU topo: Num. cores per package: 1 Jan 14 00:58:02.928196 kernel: CPU topo: Num. 
threads per package: 1 Jan 14 00:58:02.928203 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 14 00:58:02.928212 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 14 00:58:02.928219 kernel: kvm-guest: KVM setup pv remote TLB flush Jan 14 00:58:02.928226 kernel: kvm-guest: setup PV sched yield Jan 14 00:58:02.928234 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 14 00:58:02.928241 kernel: Booting paravirtualized kernel on KVM Jan 14 00:58:02.928248 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 14 00:58:02.928256 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 14 00:58:02.928263 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 14 00:58:02.928272 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 14 00:58:02.928279 kernel: pcpu-alloc: [0] 0 1 Jan 14 00:58:02.928286 kernel: kvm-guest: PV spinlocks enabled Jan 14 00:58:02.928294 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 14 00:58:02.928302 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=6d34ab71a3dc5a0ab37eb2c851228af18a1e24f648223df9a1099dbd7db2cfcf Jan 14 00:58:02.928310 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 14 00:58:02.928326 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 14 00:58:02.928334 kernel: Fallback order for Node 0: 0 Jan 14 00:58:02.928341 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694 Jan 14 00:58:02.928348 kernel: Policy zone: Normal Jan 14 00:58:02.928356 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 14 00:58:02.928363 kernel: software IO TLB: area num 2. Jan 14 00:58:02.928371 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 14 00:58:02.928380 kernel: ftrace: allocating 40097 entries in 157 pages Jan 14 00:58:02.928388 kernel: ftrace: allocated 157 pages with 5 groups Jan 14 00:58:02.928396 kernel: Dynamic Preempt: voluntary Jan 14 00:58:02.928404 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 14 00:58:02.928413 kernel: rcu: RCU event tracing is enabled. Jan 14 00:58:02.928421 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 14 00:58:02.928429 kernel: Trampoline variant of Tasks RCU enabled. Jan 14 00:58:02.928436 kernel: Rude variant of Tasks RCU enabled. Jan 14 00:58:02.928445 kernel: Tracing variant of Tasks RCU enabled. Jan 14 00:58:02.928453 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 14 00:58:02.928461 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 14 00:58:02.928468 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 14 00:58:02.928476 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 14 00:58:02.928484 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 14 00:58:02.928492 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 14 00:58:02.928501 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 14 00:58:02.928509 kernel: Console: colour dummy device 80x25 Jan 14 00:58:02.928517 kernel: printk: legacy console [tty0] enabled Jan 14 00:58:02.928524 kernel: printk: legacy console [ttyS0] enabled Jan 14 00:58:02.928532 kernel: ACPI: Core revision 20240827 Jan 14 00:58:02.928539 kernel: APIC: Switch to symmetric I/O mode setup Jan 14 00:58:02.928547 kernel: x2apic enabled Jan 14 00:58:02.928556 kernel: APIC: Switched APIC routing to: physical x2apic Jan 14 00:58:02.928564 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Jan 14 00:58:02.928572 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Jan 14 00:58:02.928580 kernel: kvm-guest: setup PV IPIs Jan 14 00:58:02.928587 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns Jan 14 00:58:02.928595 kernel: Calibrating delay loop (skipped) preset value.. 4589.15 BogoMIPS (lpj=2294578) Jan 14 00:58:02.928602 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 14 00:58:02.928611 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 14 00:58:02.928618 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 14 00:58:02.928624 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 14 00:58:02.928631 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jan 14 00:58:02.928638 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 14 00:58:02.928645 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 14 00:58:02.928652 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 14 00:58:02.928658 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 14 00:58:02.928665 kernel: TAA: Mitigation: Clear CPU buffers Jan 14 00:58:02.928671 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Jan 14 00:58:02.928680 kernel: active return thunk: its_return_thunk Jan 14 00:58:02.928686 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 14 00:58:02.928693 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 14 00:58:02.928700 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 14 00:58:02.928707 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 14 00:58:02.928713 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 14 00:58:02.928720 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 14 00:58:02.928727 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 14 00:58:02.928733 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 14 00:58:02.928740 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 14 00:58:02.928748 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 14 00:58:02.928755 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 14 00:58:02.928762 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 14 00:58:02.928770 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 14 00:58:02.928777 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Jan 14 00:58:02.928784 kernel: Freeing SMP alternatives memory: 32K Jan 14 00:58:02.928791 kernel: pid_max: default: 32768 minimum: 301 Jan 14 00:58:02.928798 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 14 00:58:02.928805 kernel: landlock: Up and running. Jan 14 00:58:02.928812 kernel: SELinux: Initializing. Jan 14 00:58:02.928819 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 14 00:58:02.928828 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 14 00:58:02.928835 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Jan 14 00:58:02.928843 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Jan 14 00:58:02.928850 kernel: ... version: 2 Jan 14 00:58:02.928858 kernel: ... bit width: 48 Jan 14 00:58:02.928865 kernel: ... generic registers: 8 Jan 14 00:58:02.928872 kernel: ... value mask: 0000ffffffffffff Jan 14 00:58:02.928879 kernel: ... max period: 00007fffffffffff Jan 14 00:58:02.928888 kernel: ... fixed-purpose events: 3 Jan 14 00:58:02.928895 kernel: ... event mask: 00000007000000ff Jan 14 00:58:02.928902 kernel: signal: max sigframe size: 3632 Jan 14 00:58:02.928909 kernel: rcu: Hierarchical SRCU implementation. Jan 14 00:58:02.928916 kernel: rcu: Max phase no-delay instances is 400. Jan 14 00:58:02.928924 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 14 00:58:02.928931 kernel: smp: Bringing up secondary CPUs ... Jan 14 00:58:02.928940 kernel: smpboot: x86: Booting SMP configuration: Jan 14 00:58:02.928947 kernel: .... node #0, CPUs: #1 Jan 14 00:58:02.928954 kernel: smp: Brought up 1 node, 2 CPUs Jan 14 00:58:02.928961 kernel: smpboot: Total of 2 processors activated (9178.31 BogoMIPS) Jan 14 00:58:02.928968 kernel: Memory: 3969764K/4186776K available (14336K kernel code, 2445K rwdata, 31636K rodata, 15536K init, 2504K bss, 212132K reserved, 0K cma-reserved) Jan 14 00:58:02.928975 kernel: devtmpfs: initialized Jan 14 00:58:02.928983 kernel: x86/mm: Memory block size: 128MB Jan 14 00:58:02.928992 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Jan 14 00:58:02.928999 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Jan 14 00:58:02.929006 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Jan 14 00:58:02.929013 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 14 00:58:02.929021 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes) Jan 14 00:58:02.929028 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes) Jan 14 00:58:02.929036 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 14 00:58:02.929044 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 14 00:58:02.929058 kernel: pinctrl core: initialized pinctrl subsystem Jan 14 00:58:02.929065 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 14 00:58:02.929073 kernel: audit: initializing netlink subsys (disabled) Jan 14 00:58:02.929081 kernel: audit: type=2000 audit(1768352279.753:1): state=initialized audit_enabled=0 res=1 Jan 14 00:58:02.929088 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 14 00:58:02.929096 kernel: thermal_sys: Registered thermal governor 'user_space' 
Jan 14 00:58:02.929105 kernel: cpuidle: using governor menu Jan 14 00:58:02.929113 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 14 00:58:02.929121 kernel: dca service started, version 1.12.1 Jan 14 00:58:02.929129 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 14 00:58:02.929137 kernel: PCI: Using configuration type 1 for base access Jan 14 00:58:02.929145 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 14 00:58:02.929152 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 14 00:58:02.929162 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 14 00:58:02.929170 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 14 00:58:02.929178 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 14 00:58:02.929189 kernel: ACPI: Added _OSI(Module Device) Jan 14 00:58:02.929203 kernel: ACPI: Added _OSI(Processor Device) Jan 14 00:58:02.929211 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 14 00:58:02.929219 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 14 00:58:02.929227 kernel: ACPI: Interpreter enabled Jan 14 00:58:02.929237 kernel: ACPI: PM: (supports S0 S3 S5) Jan 14 00:58:02.929245 kernel: ACPI: Using IOAPIC for interrupt routing Jan 14 00:58:02.929254 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 14 00:58:02.929262 kernel: PCI: Using E820 reservations for host bridge windows Jan 14 00:58:02.929273 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 14 00:58:02.929281 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 14 00:58:02.929462 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 14 00:58:02.929577 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 14 00:58:02.929675 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 14 00:58:02.929685 kernel: PCI host bridge to bus 0000:00 Jan 14 00:58:02.929782 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 14 00:58:02.929871 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 14 00:58:02.929959 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 14 00:58:02.930058 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 14 00:58:02.930149 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 14 00:58:02.930235 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Jan 14 00:58:02.930321 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 14 00:58:02.930445 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 14 00:58:02.930550 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 14 00:58:02.930642 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 14 00:58:02.930735 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Jan 14 00:58:02.930824 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Jan 14 00:58:02.930912 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 14 00:58:02.931007 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 14 00:58:02.931114 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 
0x060400 PCIe Root Port Jan 14 00:58:02.931210 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Jan 14 00:58:02.931306 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 14 00:58:02.931400 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 14 00:58:02.931491 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 14 00:58:02.931583 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.931681 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.931771 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Jan 14 00:58:02.931860 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 14 00:58:02.931947 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 14 00:58:02.932039 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 14 00:58:02.932148 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.932239 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Jan 14 00:58:02.932327 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 14 00:58:02.932415 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 14 00:58:02.932504 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 14 00:58:02.932937 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.933032 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Jan 14 00:58:02.933134 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 14 00:58:02.933225 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 14 00:58:02.933318 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 14 00:58:02.933413 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.933507 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Jan 14 00:58:02.933596 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 14 00:58:02.933685 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 14 00:58:02.933775 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 14 00:58:02.933872 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.933965 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Jan 14 00:58:02.934074 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 14 00:58:02.934166 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 14 00:58:02.934259 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 14 00:58:02.934360 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.934466 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Jan 14 00:58:02.934563 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 14 00:58:02.934657 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 14 00:58:02.934747 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 14 00:58:02.934842 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.934931 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Jan 14 00:58:02.935015 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 14 00:58:02.935114 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 14 
00:58:02.935204 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 14 00:58:02.935298 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.935389 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Jan 14 00:58:02.935479 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 14 00:58:02.935566 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 14 00:58:02.935656 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 14 00:58:02.935753 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.935856 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Jan 14 00:58:02.935947 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 14 00:58:02.936039 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 14 00:58:02.936143 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 14 00:58:02.936242 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.936333 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Jan 14 00:58:02.936425 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 14 00:58:02.936516 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 14 00:58:02.936607 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 14 00:58:02.936702 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.936797 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Jan 14 00:58:02.936891 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 14 00:58:02.936982 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 14 00:58:02.937077 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 14 00:58:02.937168 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.937257 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Jan 14 00:58:02.937345 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 14 00:58:02.937433 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 14 00:58:02.937524 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 14 00:58:02.937624 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.937715 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Jan 14 00:58:02.937805 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 14 00:58:02.937898 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 14 00:58:02.937992 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 14 00:58:02.938108 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.938226 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Jan 14 00:58:02.938322 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 14 00:58:02.938424 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 14 00:58:02.938517 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 14 00:58:02.938613 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.938705 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Jan 14 00:58:02.938797 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 14 
00:58:02.938888 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 14 00:58:02.938979 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 14 00:58:02.939083 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.939174 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Jan 14 00:58:02.939265 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 14 00:58:02.939360 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 14 00:58:02.939451 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 14 00:58:02.939547 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.939639 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Jan 14 00:58:02.939731 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 14 00:58:02.939821 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 14 00:58:02.939915 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 14 00:58:02.940012 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.940110 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Jan 14 00:58:02.940199 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 14 00:58:02.940287 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 14 00:58:02.940376 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 00:58:02.940475 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.940565 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 14 00:58:02.940655 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 00:58:02.940747 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 14 00:58:02.940838 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 00:58:02.940934 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.941030 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 14 00:58:02.941134 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 00:58:02.941223 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 14 00:58:02.941310 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 00:58:02.941403 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.941495 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 14 00:58:02.941584 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 00:58:02.941672 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 14 00:58:02.941760 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 00:58:02.941898 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.941989 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 14 00:58:02.942089 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 00:58:02.942181 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 14 00:58:02.942272 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 00:58:02.942371 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.942481 kernel: pci 0000:00:04.7: BAR 0 [mem 
0x84386000-0x84386fff] Jan 14 00:58:02.942574 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 00:58:02.942658 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 14 00:58:02.942742 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 00:58:02.942831 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.942918 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 14 00:58:02.943002 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 00:58:02.943111 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 14 00:58:02.943195 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 00:58:02.943289 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.943376 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 14 00:58:02.943465 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 14 00:58:02.943554 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 14 00:58:02.943680 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 00:58:02.943795 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.943886 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 14 00:58:02.943970 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 00:58:02.944072 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 14 00:58:02.944162 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 00:58:02.944261 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.944354 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 14 00:58:02.944445 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 00:58:02.944538 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 14 00:58:02.944632 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 14 00:58:02.944736 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:58:02.944828 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 14 00:58:02.944920 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 00:58:02.945011 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 14 00:58:02.945117 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 00:58:02.945219 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 14 00:58:02.945311 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 14 00:58:02.945411 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 14 00:58:02.945505 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 14 00:58:02.945596 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 14 00:58:02.945692 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 14 00:58:02.945786 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 14 00:58:02.949376 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 14 00:58:02.949537 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 14 00:58:02.949639 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 00:58:02.949738 kernel: pci 
0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 14 00:58:02.949838 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 14 00:58:02.949932 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.950026 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 14 00:58:02.950136 kernel: pci_bus 0000:02: extended config space not accessible Jan 14 00:58:02.950148 kernel: acpiphp: Slot [1] registered Jan 14 00:58:02.950156 kernel: acpiphp: Slot [0] registered Jan 14 00:58:02.950165 kernel: acpiphp: Slot [2] registered Jan 14 00:58:02.950176 kernel: acpiphp: Slot [3] registered Jan 14 00:58:02.950184 kernel: acpiphp: Slot [4] registered Jan 14 00:58:02.950193 kernel: acpiphp: Slot [5] registered Jan 14 00:58:02.950202 kernel: acpiphp: Slot [6] registered Jan 14 00:58:02.950210 kernel: acpiphp: Slot [7] registered Jan 14 00:58:02.950219 kernel: acpiphp: Slot [8] registered Jan 14 00:58:02.950230 kernel: acpiphp: Slot [9] registered Jan 14 00:58:02.950240 kernel: acpiphp: Slot [10] registered Jan 14 00:58:02.950248 kernel: acpiphp: Slot [11] registered Jan 14 00:58:02.950257 kernel: acpiphp: Slot [12] registered Jan 14 00:58:02.950265 kernel: acpiphp: Slot [13] registered Jan 14 00:58:02.950274 kernel: acpiphp: Slot [14] registered Jan 14 00:58:02.950282 kernel: acpiphp: Slot [15] registered Jan 14 00:58:02.950290 kernel: acpiphp: Slot [16] registered Jan 14 00:58:02.950299 kernel: acpiphp: Slot [17] registered Jan 14 00:58:02.950309 kernel: acpiphp: Slot [18] registered Jan 14 00:58:02.950318 kernel: acpiphp: Slot [19] registered Jan 14 00:58:02.950326 kernel: acpiphp: Slot [20] registered Jan 14 00:58:02.950334 kernel: acpiphp: Slot [21] registered Jan 14 00:58:02.950343 kernel: acpiphp: Slot [22] registered Jan 14 00:58:02.950352 kernel: acpiphp: Slot [23] registered Jan 14 00:58:02.950365 kernel: acpiphp: Slot [24] registered Jan 14 00:58:02.950378 kernel: acpiphp: Slot [25] registered Jan 14 00:58:02.950396 kernel: acpiphp: Slot [26] registered Jan 14 00:58:02.950428 kernel: acpiphp: Slot [27] registered Jan 14 00:58:02.950437 kernel: acpiphp: Slot [28] registered Jan 14 00:58:02.950445 kernel: acpiphp: Slot [29] registered Jan 14 00:58:02.950454 kernel: acpiphp: Slot [30] registered Jan 14 00:58:02.950462 kernel: acpiphp: Slot [31] registered Jan 14 00:58:02.950574 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 14 00:58:02.950673 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 14 00:58:02.950769 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 00:58:02.950779 kernel: acpiphp: Slot [0-2] registered Jan 14 00:58:02.950879 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 14 00:58:02.950975 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 14 00:58:02.951151 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 14 00:58:02.951252 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 14 00:58:02.951342 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 14 00:58:02.951352 kernel: acpiphp: Slot [0-3] registered Jan 14 00:58:02.951448 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 14 00:58:02.951536 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 14 00:58:02.951625 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 14 00:58:02.951710 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 14 00:58:02.951721 
kernel: acpiphp: Slot [0-4] registered Jan 14 00:58:02.951810 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 14 00:58:02.951901 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 14 00:58:02.951990 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 14 00:58:02.952003 kernel: acpiphp: Slot [0-5] registered Jan 14 00:58:02.952115 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 14 00:58:02.952208 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 14 00:58:02.952298 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 14 00:58:02.952389 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 14 00:58:02.952399 kernel: acpiphp: Slot [0-6] registered Jan 14 00:58:02.952492 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 14 00:58:02.952503 kernel: acpiphp: Slot [0-7] registered Jan 14 00:58:02.952593 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 14 00:58:02.952603 kernel: acpiphp: Slot [0-8] registered Jan 14 00:58:02.952693 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 14 00:58:02.952703 kernel: acpiphp: Slot [0-9] registered Jan 14 00:58:02.952795 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 14 00:58:02.952807 kernel: acpiphp: Slot [0-10] registered Jan 14 00:58:02.952897 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 14 00:58:02.952907 kernel: acpiphp: Slot [0-11] registered Jan 14 00:58:02.952999 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 14 00:58:02.953010 kernel: acpiphp: Slot [0-12] registered Jan 14 00:58:02.953120 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 14 00:58:02.953133 kernel: acpiphp: Slot [0-13] registered Jan 14 00:58:02.953226 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 14 00:58:02.953236 kernel: acpiphp: Slot [0-14] registered Jan 14 00:58:02.953329 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 14 00:58:02.953340 kernel: acpiphp: Slot [0-15] registered Jan 14 00:58:02.953431 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 14 00:58:02.953443 kernel: acpiphp: Slot [0-16] registered Jan 14 00:58:02.953537 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 14 00:58:02.953548 kernel: acpiphp: Slot [0-17] registered Jan 14 00:58:02.953638 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 14 00:58:02.953648 kernel: acpiphp: Slot [0-18] registered Jan 14 00:58:02.953739 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 14 00:58:02.953750 kernel: acpiphp: Slot [0-19] registered Jan 14 00:58:02.953841 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 14 00:58:02.953852 kernel: acpiphp: Slot [0-20] registered Jan 14 00:58:02.953944 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 00:58:02.953956 kernel: acpiphp: Slot [0-21] registered Jan 14 00:58:02.954058 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 00:58:02.954069 kernel: acpiphp: Slot [0-22] registered Jan 14 00:58:02.954166 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 00:58:02.954176 kernel: acpiphp: Slot [0-23] registered Jan 14 00:58:02.954267 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 00:58:02.954278 kernel: acpiphp: Slot [0-24] registered Jan 14 00:58:02.954372 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 00:58:02.954383 kernel: acpiphp: Slot [0-25] registered Jan 14 00:58:02.954491 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 00:58:02.954502 kernel: acpiphp: Slot [0-26] registered Jan 14 00:58:02.954592 kernel: pci 0000:00:05.1: PCI bridge to [bus 
1b] Jan 14 00:58:02.954602 kernel: acpiphp: Slot [0-27] registered Jan 14 00:58:02.954692 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 00:58:02.954702 kernel: acpiphp: Slot [0-28] registered Jan 14 00:58:02.954794 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 00:58:02.954807 kernel: acpiphp: Slot [0-29] registered Jan 14 00:58:02.954899 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 00:58:02.954910 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 14 00:58:02.954919 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 14 00:58:02.954928 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 14 00:58:02.954936 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 14 00:58:02.954947 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 14 00:58:02.954955 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 14 00:58:02.954964 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 14 00:58:02.954972 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 14 00:58:02.954981 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 14 00:58:02.954989 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 14 00:58:02.954998 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 14 00:58:02.955006 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 14 00:58:02.955016 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 14 00:58:02.955024 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 14 00:58:02.955032 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 14 00:58:02.955040 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 14 00:58:02.955055 kernel: iommu: Default domain type: Translated Jan 14 00:58:02.955064 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 14 00:58:02.955072 kernel: efivars: Registered efivars operations Jan 14 00:58:02.955081 kernel: PCI: Using ACPI for IRQ routing Jan 14 00:58:02.955089 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 14 00:58:02.955098 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 14 00:58:02.955106 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 14 00:58:02.955114 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 14 00:58:02.955122 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 14 00:58:02.955130 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 14 00:58:02.955140 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 14 00:58:02.955149 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 14 00:58:02.955157 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 14 00:58:02.955165 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 14 00:58:02.955260 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 14 00:58:02.955352 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 14 00:58:02.955447 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 14 00:58:02.955458 kernel: vgaarb: loaded Jan 14 00:58:02.955466 kernel: clocksource: Switched to clocksource kvm-clock Jan 14 00:58:02.955474 kernel: VFS: Disk quotas dquot_6.6.0 Jan 14 00:58:02.955483 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 14 00:58:02.955491 kernel: pnp: PnP ACPI init Jan 14 00:58:02.955594 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] 
has been reserved Jan 14 00:58:02.955607 kernel: pnp: PnP ACPI: found 5 devices Jan 14 00:58:02.955615 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 14 00:58:02.955624 kernel: NET: Registered PF_INET protocol family Jan 14 00:58:02.955632 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 14 00:58:02.955640 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 14 00:58:02.955649 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 14 00:58:02.955657 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 14 00:58:02.955668 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 14 00:58:02.955676 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 14 00:58:02.955685 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 00:58:02.955693 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 00:58:02.955702 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 14 00:58:02.955710 kernel: NET: Registered PF_XDP protocol family Jan 14 00:58:02.955807 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 14 00:58:02.955902 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 14 00:58:02.956009 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 14 00:58:02.956136 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 14 00:58:02.956230 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 14 00:58:02.956325 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 14 00:58:02.956421 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 14 00:58:02.956518 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 14 00:58:02.956611 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 14 00:58:02.956704 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 14 00:58:02.956798 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 14 00:58:02.956891 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 14 00:58:02.956986 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 14 00:58:02.957090 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 14 00:58:02.957185 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 14 00:58:02.957277 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 14 00:58:02.957367 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 14 00:58:02.957457 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 14 00:58:02.957547 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 14 00:58:02.957638 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 14 00:58:02.957732 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 14 00:58:02.957823 kernel: pci 
0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 14 00:58:02.957915 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 14 00:58:02.958008 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 14 00:58:02.958138 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 14 00:58:02.958324 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 14 00:58:02.958474 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 14 00:58:02.958611 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 14 00:58:02.958714 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 14 00:58:02.958808 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 14 00:58:02.958899 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 14 00:58:02.958988 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 14 00:58:02.959080 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 14 00:58:02.959177 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 14 00:58:02.959261 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 14 00:58:02.959345 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 14 00:58:02.959433 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 14 00:58:02.959522 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 14 00:58:02.959610 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 14 00:58:02.959700 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 14 00:58:02.959789 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 14 00:58:02.959877 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 14 00:58:02.959966 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960063 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.960152 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960242 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.960335 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960424 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.960515 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960612 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.960702 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960796 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.960890 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.960978 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.961075 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.961168 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.961263 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 
00:58:02.961359 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.961476 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.961568 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.961662 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.961757 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.961851 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.961945 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.962038 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.962142 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.962245 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.962344 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.962449 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.962548 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.962641 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.962736 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.962830 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 14 00:58:02.962941 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 14 00:58:02.963038 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 14 00:58:02.963139 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 14 00:58:02.963228 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 14 00:58:02.963321 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 14 00:58:02.963417 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 14 00:58:02.963512 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 14 00:58:02.963607 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 14 00:58:02.963701 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 14 00:58:02.963796 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 14 00:58:02.963891 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 14 00:58:02.963987 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 14 00:58:02.964093 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.964188 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.964282 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.964378 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.964468 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.964559 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.964653 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.964746 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.964840 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.964934 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.965028 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.965145 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.965241 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.965347 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.965461 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.965587 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.965681 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.965775 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.965867 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.965963 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.966067 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.966163 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.966257 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.966351 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.966455 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.966568 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.966659 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.966747 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.966836 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 00:58:02.966922 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 14 00:58:02.967013 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 14 00:58:02.967106 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 14 00:58:02.967194 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 14 00:58:02.967277 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.967359 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 14 00:58:02.967440 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 14 00:58:02.967520 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 14 00:58:02.967600 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.967688 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Jan 14 00:58:02.967773 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 14 00:58:02.967856 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 14 00:58:02.967938 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 14 00:58:02.968021 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 14 00:58:02.968114 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 14 00:58:02.969040 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 14 
00:58:02.969161 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 14 00:58:02.969254 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 14 00:58:02.969339 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 14 00:58:02.969424 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 14 00:58:02.969510 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 14 00:58:02.969597 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 14 00:58:02.969686 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 14 00:58:02.969775 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 14 00:58:02.969864 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 14 00:58:02.969960 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 14 00:58:02.970060 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 14 00:58:02.970153 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 14 00:58:02.970247 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 14 00:58:02.970341 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 14 00:58:02.970449 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 14 00:58:02.970545 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 14 00:58:02.970637 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 14 00:58:02.970729 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 14 00:58:02.970822 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 14 00:58:02.970913 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 14 00:58:02.971005 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 14 00:58:02.971105 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 14 00:58:02.971199 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 14 00:58:02.971289 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 14 00:58:02.971380 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 14 00:58:02.971469 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 14 00:58:02.971557 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 14 00:58:02.971645 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 14 00:58:02.971730 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 14 00:58:02.971817 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 14 00:58:02.971902 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 14 00:58:02.971990 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 14 00:58:02.972128 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 14 00:58:02.972218 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 14 00:58:02.972306 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 14 00:58:02.972395 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 14 00:58:02.972490 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 14 00:58:02.972579 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 14 00:58:02.972668 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 14 
00:58:02.972761 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 14 00:58:02.972852 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 14 00:58:02.972943 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 14 00:58:02.973037 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 14 00:58:02.973803 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 14 00:58:02.973900 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 14 00:58:02.973993 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 14 00:58:02.974096 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 14 00:58:02.974191 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 14 00:58:02.974287 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 14 00:58:02.974380 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 14 00:58:02.974485 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 00:58:02.974579 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 14 00:58:02.974669 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 14 00:58:02.974758 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 14 00:58:02.974842 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 00:58:02.974929 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 14 00:58:02.975013 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 14 00:58:02.975106 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 14 00:58:02.975187 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 00:58:02.975270 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 14 00:58:02.975351 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 14 00:58:02.975434 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 14 00:58:02.975514 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 00:58:02.975602 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 14 00:58:02.975684 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 14 00:58:02.975766 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 14 00:58:02.975849 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 00:58:02.975935 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 14 00:58:02.976018 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 14 00:58:02.977198 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 14 00:58:02.977316 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 00:58:02.977415 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 14 00:58:02.977509 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 14 00:58:02.977602 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 14 00:58:02.977691 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 00:58:02.978148 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 14 00:58:02.978250 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 14 00:58:02.978343 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 14 00:58:02.978450 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 
00:58:02.978551 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 14 00:58:02.978636 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 14 00:58:02.978722 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 14 00:58:02.978812 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 00:58:02.978903 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 14 00:58:02.978992 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 14 00:58:02.979200 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 14 00:58:02.979287 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 14 00:58:02.979375 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 14 00:58:02.979459 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 14 00:58:02.979543 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 14 00:58:02.979627 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 00:58:02.979720 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 14 00:58:02.979803 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 14 00:58:02.979880 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 14 00:58:02.979956 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 14 00:58:02.980031 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 14 00:58:02.980241 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 14 00:58:02.980337 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 14 00:58:02.980423 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 14 00:58:02.980507 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.980597 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 14 00:58:02.980684 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 14 00:58:02.980770 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 14 00:58:02.980862 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 14 00:58:02.980945 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 14 00:58:02.981037 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 14 00:58:02.981147 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 14 00:58:02.981235 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 14 00:58:02.981322 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 14 00:58:02.981412 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 14 00:58:02.981540 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 14 00:58:02.981632 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 14 00:58:02.981716 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 14 00:58:02.981813 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 14 00:58:02.981901 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 14 00:58:02.981995 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 14 00:58:02.985201 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 14 00:58:02.985317 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Jan 14 00:58:02.985404 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 14 00:58:02.985500 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 14 00:58:02.985585 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 14 00:58:02.985676 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 14 00:58:02.985762 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 14 00:58:02.985863 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 14 00:58:02.985949 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 14 00:58:02.986043 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 14 00:58:02.986150 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 14 00:58:02.986245 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 14 00:58:02.986338 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 14 00:58:02.986450 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 14 00:58:02.986539 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 14 00:58:02.986630 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 14 00:58:02.986714 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 14 00:58:02.986807 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 14 00:58:02.986894 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 14 00:58:02.986979 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 14 00:58:02.987081 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 14 00:58:02.987166 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 14 00:58:02.987250 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 14 00:58:02.987342 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 14 00:58:02.987427 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 14 00:58:02.987579 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 14 00:58:02.987670 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 14 00:58:02.987755 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 14 00:58:02.987839 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 14 00:58:02.987935 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 14 00:58:02.988022 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Jan 14 00:58:02.988124 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 14 00:58:02.988216 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 14 00:58:02.988303 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 14 00:58:02.988391 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 14 00:58:02.988483 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 14 00:58:02.988569 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 14 00:58:02.988655 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 14 00:58:02.988748 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 14 00:58:02.988834 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Jan 14 00:58:02.988923 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 14 00:58:02.989016 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 14 00:58:02.989113 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 14 00:58:02.989200 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 14 00:58:02.989290 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 14 00:58:02.989378 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 14 00:58:02.989464 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 14 00:58:02.989556 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 14 00:58:02.989643 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 14 00:58:02.989729 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 14 00:58:02.989819 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 14 00:58:02.989908 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 14 00:58:02.989995 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 14 00:58:02.990094 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 14 00:58:02.990182 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 14 00:58:02.990267 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 14 00:58:02.990279 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 14 00:58:02.990290 kernel: PCI: CLS 0 bytes, default 64 Jan 14 00:58:02.990299 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 14 00:58:02.990307 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 14 00:58:02.990316 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 14 00:58:02.990325 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns Jan 14 00:58:02.990333 kernel: Initialise system trusted keyrings Jan 14 00:58:02.990341 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 14 00:58:02.990352 kernel: Key type asymmetric registered Jan 14 00:58:02.990360 kernel: Asymmetric key parser 'x509' registered Jan 14 00:58:02.990369 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 14 00:58:02.990377 kernel: io scheduler mq-deadline registered Jan 14 00:58:02.990385 kernel: io scheduler kyber registered Jan 14 00:58:02.990394 kernel: io scheduler bfq registered Jan 14 00:58:02.990510 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 14 00:58:02.990613 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 14 00:58:02.990703 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 14 00:58:02.990789 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 14 00:58:02.990879 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 14 00:58:02.990968 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 14 00:58:02.991079 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 14 00:58:02.991167 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 14 00:58:02.991257 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 14 00:58:02.991344 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 14 00:58:02.991430 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 14 00:58:02.991523 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Jan 14 00:58:02.991615 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 14 00:58:02.991705 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 14 00:58:02.991797 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 14 00:58:02.991890 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 14 00:58:02.991903 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 14 00:58:02.991997 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 14 00:58:02.994160 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 14 00:58:02.994284 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 14 00:58:02.994386 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 14 00:58:02.994504 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 14 00:58:02.994600 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 14 00:58:02.994695 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 14 00:58:02.994789 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 14 00:58:02.994885 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 14 00:58:02.994981 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 14 00:58:02.995084 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 14 00:58:02.995177 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 14 00:58:02.995271 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 14 00:58:02.995363 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 14 00:58:02.995460 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 14 00:58:02.995556 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 14 00:58:02.995568 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 14 00:58:02.995664 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 14 00:58:02.995757 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 14 00:58:02.995851 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 14 00:58:02.995943 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 14 00:58:02.996042 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 14 00:58:02.996154 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 14 00:58:02.996253 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 14 00:58:02.996348 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 14 00:58:02.996446 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 14 00:58:02.996542 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 14 00:58:02.996642 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 14 00:58:02.996738 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 14 00:58:02.996835 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 14 00:58:02.996935 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 14 00:58:02.997032 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 14 00:58:02.997135 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 14 00:58:02.997146 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 14 00:58:02.997241 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 14 00:58:02.997333 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 14 00:58:02.997427 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 14 00:58:02.997519 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 14 00:58:02.997616 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 14 00:58:02.997711 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 14 00:58:02.997810 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Jan 14 00:58:02.997908 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 14 00:58:02.998005 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 14 00:58:02.999012 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 14 00:58:02.999031 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 14 00:58:02.999041 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 14 00:58:02.999058 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 14 00:58:02.999071 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 14 00:58:02.999079 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 14 00:58:02.999088 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 14 00:58:02.999196 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 14 00:58:02.999209 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 14 00:58:02.999297 kernel: rtc_cmos 00:03: registered as rtc0 Jan 14 00:58:02.999389 kernel: rtc_cmos 00:03: setting system clock to 2026-01-14T00:58:01 UTC (1768352281) Jan 14 00:58:02.999483 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 14 00:58:02.999493 kernel: intel_pstate: CPU model not supported Jan 14 00:58:02.999502 kernel: efifb: probing for efifb Jan 14 00:58:02.999510 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 14 00:58:02.999519 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 14 00:58:02.999528 kernel: efifb: scrolling: redraw Jan 14 00:58:02.999536 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 14 00:58:02.999547 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 00:58:02.999557 kernel: fb0: EFI VGA frame buffer device Jan 14 00:58:02.999565 kernel: pstore: Using crash dump compression: deflate Jan 14 00:58:02.999574 kernel: pstore: Registered efi_pstore as persistent store backend Jan 14 00:58:02.999583 kernel: NET: Registered PF_INET6 protocol family Jan 14 00:58:02.999592 kernel: Segment Routing with IPv6 Jan 14 00:58:02.999600 kernel: In-situ OAM (IOAM) with IPv6 Jan 14 00:58:02.999610 kernel: NET: Registered PF_PACKET protocol family Jan 14 00:58:02.999619 kernel: Key type dns_resolver registered Jan 14 00:58:02.999628 kernel: IPI shorthand broadcast: enabled Jan 14 00:58:02.999636 kernel: sched_clock: Marking stable (2485001600, 155798331)->(2742594897, -101794966) Jan 14 00:58:02.999644 kernel: registered taskstats version 1 Jan 14 00:58:02.999653 kernel: Loading compiled-in X.509 certificates Jan 14 00:58:02.999662 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: 58a78462583b088d099087e6f2d97e37d80e06bb' Jan 14 00:58:02.999672 kernel: Demotion targets for Node 0: null Jan 14 00:58:02.999681 kernel: Key type .fscrypt registered Jan 14 00:58:02.999689 kernel: Key type fscrypt-provisioning registered Jan 14 00:58:02.999697 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 14 00:58:02.999706 kernel: ima: Allocated hash algorithm: sha1 Jan 14 00:58:02.999714 kernel: ima: No architecture policies found Jan 14 00:58:02.999723 kernel: clk: Disabling unused clocks Jan 14 00:58:02.999734 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 14 00:58:02.999742 kernel: Write protecting the kernel read-only data: 47104k Jan 14 00:58:02.999751 kernel: Freeing unused kernel image (rodata/data gap) memory: 1132K Jan 14 00:58:02.999760 kernel: Run /init as init process Jan 14 00:58:02.999768 kernel: with arguments: Jan 14 00:58:02.999777 kernel: /init Jan 14 00:58:02.999786 kernel: with environment: Jan 14 00:58:02.999794 kernel: HOME=/ Jan 14 00:58:02.999804 kernel: TERM=linux Jan 14 00:58:02.999813 kernel: SCSI subsystem initialized Jan 14 00:58:02.999821 kernel: libata version 3.00 loaded. Jan 14 00:58:02.999923 kernel: ahci 0000:00:1f.2: version 3.0 Jan 14 00:58:02.999935 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 14 00:58:03.000030 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 14 00:58:03.000145 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 14 00:58:03.000899 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 14 00:58:03.001647 kernel: scsi host0: ahci Jan 14 00:58:03.001775 kernel: scsi host1: ahci Jan 14 00:58:03.001908 kernel: scsi host2: ahci Jan 14 00:58:03.002010 kernel: scsi host3: ahci Jan 14 00:58:03.002621 kernel: scsi host4: ahci Jan 14 00:58:03.002733 kernel: scsi host5: ahci Jan 14 00:58:03.002744 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 14 00:58:03.002754 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 14 00:58:03.002763 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 14 00:58:03.002772 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 14 00:58:03.002784 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 14 00:58:03.002793 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 14 00:58:03.002801 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002810 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002819 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002829 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002838 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002848 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 14 00:58:03.002857 kernel: ACPI: bus type USB registered Jan 14 00:58:03.002865 kernel: usbcore: registered new interface driver usbfs Jan 14 00:58:03.002874 kernel: usbcore: registered new interface driver hub Jan 14 00:58:03.002883 kernel: usbcore: registered new device driver usb Jan 14 00:58:03.002991 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 14 00:58:03.004155 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 14 00:58:03.004277 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 14 00:58:03.004383 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 14 00:58:03.004513 kernel: hub 1-0:1.0: USB hub found Jan 14 00:58:03.004624 kernel: hub 1-0:1.0: 2 ports detected Jan 14 00:58:03.004734 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 14 00:58:03.004833 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks 
(53.7 GB/50.0 GiB) Jan 14 00:58:03.004848 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 14 00:58:03.004858 kernel: GPT:25804799 != 104857599 Jan 14 00:58:03.004868 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 14 00:58:03.004877 kernel: GPT:25804799 != 104857599 Jan 14 00:58:03.004885 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 14 00:58:03.004893 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 14 00:58:03.004904 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 14 00:58:03.004913 kernel: device-mapper: uevent: version 1.0.3 Jan 14 00:58:03.004922 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 14 00:58:03.004932 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 14 00:58:03.004941 kernel: raid6: avx512x4 gen() 44289 MB/s Jan 14 00:58:03.004950 kernel: raid6: avx512x2 gen() 45385 MB/s Jan 14 00:58:03.004958 kernel: raid6: avx512x1 gen() 45376 MB/s Jan 14 00:58:03.004970 kernel: raid6: avx2x4 gen() 35657 MB/s Jan 14 00:58:03.004978 kernel: raid6: avx2x2 gen() 35941 MB/s Jan 14 00:58:03.004987 kernel: raid6: avx2x1 gen() 33246 MB/s Jan 14 00:58:03.004996 kernel: raid6: using algorithm avx512x2 gen() 45385 MB/s Jan 14 00:58:03.005005 kernel: raid6: .... xor() 28417 MB/s, rmw enabled Jan 14 00:58:03.005015 kernel: raid6: using avx512x2 recovery algorithm Jan 14 00:58:03.005024 kernel: xor: automatically using best checksumming function avx Jan 14 00:58:03.005034 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 14 00:58:03.005043 kernel: BTRFS: device fsid 315c4ba2-2b68-4ff5-9a58-ddeab520c9ac devid 1 transid 33 /dev/mapper/usr (253:0) scanned by mount (205) Jan 14 00:58:03.005186 kernel: BTRFS info (device dm-0): first mount of filesystem 315c4ba2-2b68-4ff5-9a58-ddeab520c9ac Jan 14 00:58:03.005195 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 14 00:58:03.005332 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 14 00:58:03.005346 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 14 00:58:03.005359 kernel: BTRFS info (device dm-0): enabling free space tree Jan 14 00:58:03.005367 kernel: loop: module loaded Jan 14 00:58:03.005376 kernel: loop0: detected capacity change from 0 to 100552 Jan 14 00:58:03.005386 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 14 00:58:03.005396 systemd[1]: Successfully made /usr/ read-only. Jan 14 00:58:03.005408 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 00:58:03.005420 systemd[1]: Detected virtualization kvm. Jan 14 00:58:03.005429 systemd[1]: Detected architecture x86-64. Jan 14 00:58:03.005438 systemd[1]: Running in initrd. Jan 14 00:58:03.005447 systemd[1]: No hostname configured, using default hostname. Jan 14 00:58:03.005456 systemd[1]: Hostname set to . Jan 14 00:58:03.005465 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 00:58:03.005476 systemd[1]: Queued start job for default target initrd.target. 
Jan 14 00:58:03.005485 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 00:58:03.005494 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:58:03.005504 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:58:03.005514 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 14 00:58:03.005523 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 00:58:03.005534 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 14 00:58:03.005544 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 14 00:58:03.005555 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:58:03.005939 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:58:03.005949 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 14 00:58:03.005958 systemd[1]: Reached target paths.target - Path Units. Jan 14 00:58:03.005970 systemd[1]: Reached target slices.target - Slice Units. Jan 14 00:58:03.005979 systemd[1]: Reached target swap.target - Swaps. Jan 14 00:58:03.005988 systemd[1]: Reached target timers.target - Timer Units. Jan 14 00:58:03.005997 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 00:58:03.006006 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 00:58:03.006015 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:58:03.006025 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 14 00:58:03.006036 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 14 00:58:03.006518 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:58:03.006533 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 00:58:03.006542 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:58:03.006552 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 00:58:03.006561 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 14 00:58:03.006570 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 14 00:58:03.006581 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 00:58:03.006589 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 14 00:58:03.006599 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 14 00:58:03.006608 systemd[1]: Starting systemd-fsck-usr.service... Jan 14 00:58:03.006617 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 00:58:03.006625 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 00:58:03.006636 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:03.006645 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Jan 14 00:58:03.006655 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:58:03.006664 systemd[1]: Finished systemd-fsck-usr.service. Jan 14 00:58:03.006674 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 00:58:03.006706 systemd-journald[342]: Collecting audit messages is enabled. Jan 14 00:58:03.006727 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 00:58:03.006738 kernel: audit: type=1130 audit(1768352282.945:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.006748 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 00:58:03.006757 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 14 00:58:03.006766 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:03.006774 kernel: Bridge firewalling registered Jan 14 00:58:03.006783 kernel: audit: type=1130 audit(1768352282.968:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.006791 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 14 00:58:03.006802 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 00:58:03.006810 kernel: audit: type=1130 audit(1768352282.981:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.006819 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 00:58:03.006827 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:58:03.006837 kernel: audit: type=1130 audit(1768352282.996:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.006846 systemd-journald[342]: Journal started Jan 14 00:58:03.006867 systemd-journald[342]: Runtime Journal (/run/log/journal/48d68fca6aec498b9f16c652f99d0a71) is 8M, max 77.9M, 69.9M free. Jan 14 00:58:02.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:02.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:02.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:02.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:02.967748 systemd-modules-load[345]: Inserted module 'br_netfilter' Jan 14 00:58:03.010192 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 00:58:03.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.015067 kernel: audit: type=1130 audit(1768352283.010:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.015352 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 00:58:03.020107 kernel: audit: type=1130 audit(1768352283.016:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.016000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.016606 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:58:03.024210 kernel: audit: type=1130 audit(1768352283.020:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.025204 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 14 00:58:03.026000 audit: BPF prog-id=6 op=LOAD Jan 14 00:58:03.029064 kernel: audit: type=1334 audit(1768352283.026:9): prog-id=6 op=LOAD Jan 14 00:58:03.027589 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 00:58:03.030180 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 00:58:03.042328 systemd-tmpfiles[377]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 14 00:58:03.045000 dracut-cmdline[375]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=6d34ab71a3dc5a0ab37eb2c851228af18a1e24f648223df9a1099dbd7db2cfcf Jan 14 00:58:03.052212 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:58:03.061108 kernel: audit: type=1130 audit(1768352283.052:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.052000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.085254 systemd-resolved[376]: Positive Trust Anchors: Jan 14 00:58:03.085263 systemd-resolved[376]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 00:58:03.085267 systemd-resolved[376]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 00:58:03.085294 systemd-resolved[376]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 00:58:03.115544 systemd-resolved[376]: Defaulting to hostname 'linux'. Jan 14 00:58:03.117139 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 00:58:03.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.118557 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:58:03.136074 kernel: Loading iSCSI transport class v2.0-870. Jan 14 00:58:03.152072 kernel: iscsi: registered transport (tcp) Jan 14 00:58:03.174625 kernel: iscsi: registered transport (qla4xxx) Jan 14 00:58:03.174678 kernel: QLogic iSCSI HBA Driver Jan 14 00:58:03.198946 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 00:58:03.219575 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:58:03.220000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.222427 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 00:58:03.258767 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 14 00:58:03.259000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.260571 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 14 00:58:03.261613 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 14 00:58:03.289810 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 14 00:58:03.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.291000 audit: BPF prog-id=7 op=LOAD Jan 14 00:58:03.291000 audit: BPF prog-id=8 op=LOAD Jan 14 00:58:03.293160 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:58:03.314548 systemd-udevd[618]: Using default interface naming scheme 'v257'. Jan 14 00:58:03.322459 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 14 00:58:03.322000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.324776 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 14 00:58:03.345479 dracut-pre-trigger[684]: rd.md=0: removing MD RAID activation Jan 14 00:58:03.351475 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 00:58:03.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.352000 audit: BPF prog-id=9 op=LOAD Jan 14 00:58:03.354498 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 00:58:03.370803 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 00:58:03.373000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.374578 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 00:58:03.395248 systemd-networkd[734]: lo: Link UP Jan 14 00:58:03.395254 systemd-networkd[734]: lo: Gained carrier Jan 14 00:58:03.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.397122 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 00:58:03.397545 systemd[1]: Reached target network.target - Network. Jan 14 00:58:03.457437 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:58:03.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.459789 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 14 00:58:03.566933 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 14 00:58:03.577569 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 14 00:58:03.592303 kernel: cryptd: max_cpu_qlen set to 1000 Jan 14 00:58:03.601087 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 14 00:58:03.602902 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 00:58:03.611426 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 14 00:58:03.616122 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 14 00:58:03.617300 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:58:03.617387 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:03.619000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.619257 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 14 00:58:03.621793 kernel: AES CTR mode by8 optimization enabled Jan 14 00:58:03.624354 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:03.634961 kernel: usbcore: registered new interface driver usbhid Jan 14 00:58:03.635009 kernel: usbhid: USB HID core driver Jan 14 00:58:03.648579 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:58:03.657657 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jan 14 00:58:03.648653 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:03.658000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.658000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.660238 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:03.679552 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 14 00:58:03.679603 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 14 00:58:03.679789 disk-uuid[883]: Primary Header is updated. Jan 14 00:58:03.679789 disk-uuid[883]: Secondary Entries is updated. Jan 14 00:58:03.679789 disk-uuid[883]: Secondary Header is updated. Jan 14 00:58:03.677413 systemd-networkd[734]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:58:03.677417 systemd-networkd[734]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 00:58:03.677860 systemd-networkd[734]: eth0: Link UP Jan 14 00:58:03.680467 systemd-networkd[734]: eth0: Gained carrier Jan 14 00:58:03.680478 systemd-networkd[734]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:58:03.702624 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:03.703000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.712122 systemd-networkd[734]: eth0: DHCPv4 address 10.0.21.32/25, gateway 10.0.21.1 acquired from 10.0.21.1 Jan 14 00:58:03.791317 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 14 00:58:03.792000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:03.793152 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 00:58:03.794073 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:58:03.794900 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 00:58:03.796536 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 14 00:58:03.811331 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jan 14 00:58:03.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.742911 disk-uuid[886]: Warning: The kernel is still using the old partition table. Jan 14 00:58:04.742911 disk-uuid[886]: The new table will be used at the next reboot or after you Jan 14 00:58:04.742911 disk-uuid[886]: run partprobe(8) or kpartx(8) Jan 14 00:58:04.742911 disk-uuid[886]: The operation has completed successfully. Jan 14 00:58:04.747178 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 14 00:58:04.755372 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 14 00:58:04.755399 kernel: audit: type=1130 audit(1768352284.747:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.755413 kernel: audit: type=1131 audit(1768352284.747:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.747000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.747261 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 14 00:58:04.748864 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 14 00:58:04.798082 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (913) Jan 14 00:58:04.801815 kernel: BTRFS info (device vda6): first mount of filesystem 87cf3d96-2540-4b91-98c0-7ae2e759a282 Jan 14 00:58:04.801943 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 00:58:04.810343 kernel: BTRFS info (device vda6): turning on async discard Jan 14 00:58:04.810420 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 00:58:04.816081 kernel: BTRFS info (device vda6): last unmount of filesystem 87cf3d96-2540-4b91-98c0-7ae2e759a282 Jan 14 00:58:04.816911 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 14 00:58:04.821152 kernel: audit: type=1130 audit(1768352284.817:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:04.818446 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
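The disk-uuid warning above means the GPT on disk was rewritten while the kernel still holds the old in-memory partition table. Forcing a re-read is a single ioctl: BLKRRPART is what blockdev --rereadpt issues, and partprobe(8) achieves the same effect with per-partition calls. A minimal sketch, assuming the disk is /dev/vda (the device behind the vda6/vda9 partitions seen later in this log) and the caller is root:

    import fcntl
    import os

    # BLKRRPART asks the kernel to re-read the device's partition table.
    # 0x125f is the ioctl number from <linux/fs.h>.
    BLKRRPART = 0x125F

    fd = os.open("/dev/vda", os.O_RDONLY)
    try:
        # Fails with EBUSY while partitions are mounted or otherwise in use,
        # which is why the message also offers "at the next reboot".
        fcntl.ioctl(fd, BLKRRPART)
    finally:
        os.close(fd)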
Jan 14 00:58:04.884238 systemd-networkd[734]: eth0: Gained IPv6LL Jan 14 00:58:05.014200 ignition[932]: Ignition 2.24.0 Jan 14 00:58:05.014211 ignition[932]: Stage: fetch-offline Jan 14 00:58:05.014250 ignition[932]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:05.014259 ignition[932]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:05.017000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.014338 ignition[932]: parsed url from cmdline: "" Jan 14 00:58:05.017284 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 00:58:05.022150 kernel: audit: type=1130 audit(1768352285.017:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.014341 ignition[932]: no config URL provided Jan 14 00:58:05.021683 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 14 00:58:05.014345 ignition[932]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 00:58:05.014353 ignition[932]: no config at "/usr/lib/ignition/user.ign" Jan 14 00:58:05.014358 ignition[932]: failed to fetch config: resource requires networking Jan 14 00:58:05.014495 ignition[932]: Ignition finished successfully Jan 14 00:58:05.048119 ignition[942]: Ignition 2.24.0 Jan 14 00:58:05.048129 ignition[942]: Stage: fetch Jan 14 00:58:05.048262 ignition[942]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:05.048269 ignition[942]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:05.048351 ignition[942]: parsed url from cmdline: "" Jan 14 00:58:05.048354 ignition[942]: no config URL provided Jan 14 00:58:05.048362 ignition[942]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 00:58:05.048368 ignition[942]: no config at "/usr/lib/ignition/user.ign" Jan 14 00:58:05.048440 ignition[942]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 14 00:58:05.048849 ignition[942]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 14 00:58:05.048868 ignition[942]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 14 00:58:05.366255 ignition[942]: GET result: OK Jan 14 00:58:05.367089 ignition[942]: parsing config with SHA512: 48cd96b9cde5508c290dd2e3d3f57175fb02b1f34dbd1825be9a39e4f01ca19d51d9083eee3b0540dcc23622a41e3f4e6cd8f3e24037ca9729bbe43fd367ba5b Jan 14 00:58:05.371859 unknown[942]: fetched base config from "system" Jan 14 00:58:05.371867 unknown[942]: fetched base config from "system" Jan 14 00:58:05.372197 ignition[942]: fetch: fetch complete Jan 14 00:58:05.371872 unknown[942]: fetched user config from "openstack" Jan 14 00:58:05.377918 kernel: audit: type=1130 audit(1768352285.374:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.374000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.372201 ignition[942]: fetch: fetch passed Jan 14 00:58:05.374085 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
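The fetch stage above waits for a config drive (/dev/disk/by-label/config-2), then falls back to the OpenStack metadata service and logs the SHA512 of the user data it downloads before parsing it. An equivalent manual query, sketched in Python (169.254.169.254 is link-local, so this only works from inside the instance):

    import hashlib
    import urllib.request

    # The metadata endpoint Ignition queried in the log above.
    URL = "http://169.254.169.254/openstack/latest/user_data"

    with urllib.request.urlopen(URL, timeout=10) as resp:
        user_data = resp.read()

    # Ignition logs the SHA512 of the raw config before parsing it.
    print(hashlib.sha512(user_data).hexdigest())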
Jan 14 00:58:05.372233 ignition[942]: Ignition finished successfully Jan 14 00:58:05.377171 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 14 00:58:05.400910 ignition[948]: Ignition 2.24.0 Jan 14 00:58:05.400921 ignition[948]: Stage: kargs Jan 14 00:58:05.401068 ignition[948]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:05.401076 ignition[948]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:05.401673 ignition[948]: kargs: kargs passed Jan 14 00:58:05.401709 ignition[948]: Ignition finished successfully Jan 14 00:58:05.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.403930 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 14 00:58:05.408589 kernel: audit: type=1130 audit(1768352285.404:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.407160 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 14 00:58:05.434365 ignition[954]: Ignition 2.24.0 Jan 14 00:58:05.434375 ignition[954]: Stage: disks Jan 14 00:58:05.434529 ignition[954]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:05.434537 ignition[954]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:05.435693 ignition[954]: disks: disks passed Jan 14 00:58:05.435732 ignition[954]: Ignition finished successfully Jan 14 00:58:05.438261 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 14 00:58:05.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.439130 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 14 00:58:05.443069 kernel: audit: type=1130 audit(1768352285.438:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.442781 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 14 00:58:05.443112 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 00:58:05.443383 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 00:58:05.443936 systemd[1]: Reached target basic.target - Basic System. Jan 14 00:58:05.445340 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 14 00:58:05.494552 systemd-fsck[962]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 14 00:58:05.497246 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 14 00:58:05.501414 kernel: audit: type=1130 audit(1768352285.497:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.500078 systemd[1]: Mounting sysroot.mount - /sysroot... 
Jan 14 00:58:05.682228 kernel: EXT4-fs (vda9): mounted filesystem 6efdc615-0e3c-4caf-8d0b-1f38e5c59ef0 r/w with ordered data mode. Quota mode: none. Jan 14 00:58:05.682842 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 14 00:58:05.683827 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 14 00:58:05.688321 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 00:58:05.690183 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 14 00:58:05.690796 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 14 00:58:05.698165 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 14 00:58:05.698634 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 14 00:58:05.698666 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 00:58:05.702932 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 14 00:58:05.713178 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 14 00:58:05.721079 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (970) Jan 14 00:58:05.724807 kernel: BTRFS info (device vda6): first mount of filesystem 87cf3d96-2540-4b91-98c0-7ae2e759a282 Jan 14 00:58:05.724837 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 00:58:05.736404 kernel: BTRFS info (device vda6): turning on async discard Jan 14 00:58:05.736444 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 00:58:05.741996 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 00:58:05.801093 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:05.935023 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 14 00:58:05.939547 kernel: audit: type=1130 audit(1768352285.935:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.935000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.937152 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 14 00:58:05.940812 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 14 00:58:05.952564 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 14 00:58:05.955168 kernel: BTRFS info (device vda6): last unmount of filesystem 87cf3d96-2540-4b91-98c0-7ae2e759a282 Jan 14 00:58:05.980242 ignition[1072]: INFO : Ignition 2.24.0 Jan 14 00:58:05.980636 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 14 00:58:05.986043 kernel: audit: type=1130 audit(1768352285.981:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:05.986115 ignition[1072]: INFO : Stage: mount Jan 14 00:58:05.986115 ignition[1072]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:05.986115 ignition[1072]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:05.986115 ignition[1072]: INFO : mount: mount passed Jan 14 00:58:05.986115 ignition[1072]: INFO : Ignition finished successfully Jan 14 00:58:05.987000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:05.986820 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 14 00:58:06.845079 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:08.854071 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:12.861074 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:12.866371 coreos-metadata[972]: Jan 14 00:58:12.866 WARN failed to locate config-drive, using the metadata service API instead Jan 14 00:58:12.879589 coreos-metadata[972]: Jan 14 00:58:12.879 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 00:58:13.022443 coreos-metadata[972]: Jan 14 00:58:13.022 INFO Fetch successful Jan 14 00:58:13.022443 coreos-metadata[972]: Jan 14 00:58:13.022 INFO wrote hostname ci-4547-0-0-n-de0c74fc75 to /sysroot/etc/hostname Jan 14 00:58:13.024705 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 14 00:58:13.029425 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:58:13.029449 kernel: audit: type=1130 audit(1768352293.025:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:13.029462 kernel: audit: type=1131 audit(1768352293.027:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:13.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:13.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:13.025357 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 14 00:58:13.032135 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 14 00:58:13.052984 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 00:58:13.083072 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1089) Jan 14 00:58:13.087386 kernel: BTRFS info (device vda6): first mount of filesystem 87cf3d96-2540-4b91-98c0-7ae2e759a282 Jan 14 00:58:13.087422 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 00:58:13.095679 kernel: BTRFS info (device vda6): turning on async discard Jan 14 00:58:13.095723 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 00:58:13.097235 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
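flatcar-openstack-hostname above follows the same pattern: no config drive is found, so it falls back to the EC2-style metadata path for the hostname and writes the result under the not-yet-switched-to root at /sysroot/etc/hostname. A rough sketch of that fallback (the real agent also mounts and reads the config drive when present, which is omitted here):

    import os
    import urllib.request

    CONFIG_DRIVE = "/dev/disk/by-label/config-2"
    HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"

    if not os.path.exists(CONFIG_DRIVE):
        # Fall back to the metadata service, as the log shows happening
        # after the config-drive lookups time out.
        with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
            hostname = resp.read().decode().strip()
        # Persist it on the target root so it survives switch-root.
        with open("/sysroot/etc/hostname", "w") as f:
            f.write(hostname + "\n")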
Jan 14 00:58:13.125025 ignition[1107]: INFO : Ignition 2.24.0 Jan 14 00:58:13.125025 ignition[1107]: INFO : Stage: files Jan 14 00:58:13.126193 ignition[1107]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:13.126193 ignition[1107]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:13.126193 ignition[1107]: DEBUG : files: compiled without relabeling support, skipping Jan 14 00:58:13.127389 ignition[1107]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 14 00:58:13.127389 ignition[1107]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 14 00:58:13.134199 ignition[1107]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 14 00:58:13.134724 ignition[1107]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 14 00:58:13.134724 ignition[1107]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 14 00:58:13.134546 unknown[1107]: wrote ssh authorized keys file for user: core Jan 14 00:58:13.137389 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 14 00:58:13.137389 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 14 00:58:13.194937 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 14 00:58:13.313503 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 14 00:58:13.313503 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 00:58:13.315193 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 14 00:58:13.318025 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Jan 14 00:58:13.568167 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 14 00:58:14.147645 ignition[1107]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 14 00:58:14.147645 ignition[1107]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 14 00:58:14.150152 ignition[1107]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 00:58:14.153220 ignition[1107]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 00:58:14.153220 ignition[1107]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 14 00:58:14.153220 ignition[1107]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 14 00:58:14.154822 ignition[1107]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 14 00:58:14.154822 ignition[1107]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 14 00:58:14.154822 ignition[1107]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 14 00:58:14.154822 ignition[1107]: INFO : files: files passed Jan 14 00:58:14.154822 ignition[1107]: INFO : Ignition finished successfully Jan 14 00:58:14.161771 kernel: audit: type=1130 audit(1768352294.155:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.155208 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 14 00:58:14.156851 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 14 00:58:14.163188 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 14 00:58:14.170416 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 14 00:58:14.171009 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 14 00:58:14.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:14.176762 kernel: audit: type=1130 audit(1768352294.171:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.176788 kernel: audit: type=1131 audit(1768352294.171:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.181115 initrd-setup-root-after-ignition[1138]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:58:14.181741 initrd-setup-root-after-ignition[1142]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:58:14.182801 initrd-setup-root-after-ignition[1138]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:58:14.183949 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 00:58:14.184000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.185243 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 14 00:58:14.189380 kernel: audit: type=1130 audit(1768352294.184:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.190617 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 14 00:58:14.242133 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 14 00:58:14.242243 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 14 00:58:14.250773 kernel: audit: type=1130 audit(1768352294.243:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.250802 kernel: audit: type=1131 audit(1768352294.243:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.243639 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 14 00:58:14.251297 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 14 00:58:14.252382 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 14 00:58:14.253245 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 14 00:58:14.291490 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
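Among the files-stage operations logged earlier, the kubernetes sysext is the two-step one: the image is fetched into /opt/extensions on the target root and then activated by the /etc/extensions/kubernetes.raw symlink. Those two steps in isolation, sketched in Python (checksums, error handling and the other files from the config are left out):

    import os
    import urllib.request

    # /sysroot is the initramfs view of the target root, as in the log's paths.
    SYSROOT = "/sysroot"
    URL = "https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw"
    TARGET = "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"
    LINK = "/etc/extensions/kubernetes.raw"

    dest = SYSROOT + TARGET
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    urllib.request.urlretrieve(URL, dest)

    link_path = SYSROOT + LINK
    os.makedirs(os.path.dirname(link_path), exist_ok=True)
    # The symlink target is the path as it will appear after switch-root,
    # i.e. without the /sysroot prefix -- exactly as Ignition wrote it.
    os.symlink(TARGET, link_path)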
Jan 14 00:58:14.296183 kernel: audit: type=1130 audit(1768352294.292:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.294188 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 14 00:58:14.323639 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 00:58:14.324503 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:58:14.325657 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:58:14.326183 systemd[1]: Stopped target timers.target - Timer Units. Jan 14 00:58:14.327871 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 14 00:58:14.332110 kernel: audit: type=1131 audit(1768352294.328:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.327982 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 00:58:14.332220 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 14 00:58:14.333312 systemd[1]: Stopped target basic.target - Basic System. Jan 14 00:58:14.334219 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 14 00:58:14.335094 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 00:58:14.335985 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 14 00:58:14.337072 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 14 00:58:14.338117 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 14 00:58:14.339060 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 00:58:14.339930 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 14 00:58:14.340716 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 14 00:58:14.341586 systemd[1]: Stopped target swap.target - Swaps. Jan 14 00:58:14.342408 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 14 00:58:14.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.342522 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 14 00:58:14.343618 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:58:14.344499 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:58:14.345186 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 14 00:58:14.345979 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:58:14.346500 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Jan 14 00:58:14.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.346617 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 14 00:58:14.347640 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 14 00:58:14.348000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.347724 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 00:58:14.349000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.348516 systemd[1]: ignition-files.service: Deactivated successfully. Jan 14 00:58:14.348595 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 14 00:58:14.350477 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 14 00:58:14.352178 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 14 00:58:14.352928 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 14 00:58:14.353038 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:58:14.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.356884 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 14 00:58:14.356974 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:58:14.358359 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 14 00:58:14.358456 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 00:58:14.358000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.368260 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 14 00:58:14.368358 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 14 00:58:14.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.369000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:14.376488 ignition[1163]: INFO : Ignition 2.24.0 Jan 14 00:58:14.376488 ignition[1163]: INFO : Stage: umount Jan 14 00:58:14.378109 ignition[1163]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:58:14.378109 ignition[1163]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 00:58:14.378109 ignition[1163]: INFO : umount: umount passed Jan 14 00:58:14.378109 ignition[1163]: INFO : Ignition finished successfully Jan 14 00:58:14.380000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.379758 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 14 00:58:14.380285 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 14 00:58:14.382096 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 14 00:58:14.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.382173 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 14 00:58:14.383000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.382807 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 14 00:58:14.382849 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 14 00:58:14.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.384116 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 14 00:58:14.384164 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 14 00:58:14.386000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.384901 systemd[1]: Stopped target network.target - Network. Jan 14 00:58:14.385566 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 14 00:58:14.385610 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 00:58:14.386296 systemd[1]: Stopped target paths.target - Path Units. Jan 14 00:58:14.387388 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 14 00:58:14.391100 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:58:14.391739 systemd[1]: Stopped target slices.target - Slice Units. Jan 14 00:58:14.392122 systemd[1]: Stopped target sockets.target - Socket Units. Jan 14 00:58:14.392788 systemd[1]: iscsid.socket: Deactivated successfully. Jan 14 00:58:14.392825 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 00:58:14.393494 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 14 00:58:14.393533 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 00:58:14.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:14.394135 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 14 00:58:14.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.394158 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:58:14.394767 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 14 00:58:14.394811 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 14 00:58:14.395378 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 14 00:58:14.395420 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 14 00:58:14.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.396078 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 14 00:58:14.396627 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 14 00:58:14.401000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.399257 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 14 00:58:14.402000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.399775 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 14 00:58:14.399856 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 14 00:58:14.400640 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 14 00:58:14.400718 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 14 00:58:14.401640 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 14 00:58:14.401736 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 14 00:58:14.404000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.404170 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 14 00:58:14.404261 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 14 00:58:14.406000 audit: BPF prog-id=6 op=UNLOAD Jan 14 00:58:14.406951 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 14 00:58:14.407726 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 14 00:58:14.407766 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:58:14.408000 audit: BPF prog-id=9 op=UNLOAD Jan 14 00:58:14.408984 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 14 00:58:14.409345 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 14 00:58:14.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.409385 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 14 00:58:14.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.411448 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 14 00:58:14.413000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.411484 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:58:14.412004 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 14 00:58:14.412034 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 14 00:58:14.413529 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:58:14.423456 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 14 00:58:14.423564 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 00:58:14.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.426214 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 14 00:58:14.426282 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 14 00:58:14.427846 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 14 00:58:14.427881 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:58:14.428000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.428000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.428227 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 14 00:58:14.428263 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 14 00:58:14.428672 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 14 00:58:14.430000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.428707 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 14 00:58:14.430242 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 14 00:58:14.430286 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 00:58:14.432172 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 14 00:58:14.433128 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 14 00:58:14.433000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.433175 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:58:14.435119 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Jan 14 00:58:14.435156 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:58:14.436000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.436138 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 14 00:58:14.436174 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 00:58:14.436000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.436984 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 14 00:58:14.437000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.437021 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:58:14.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.437675 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:58:14.437711 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:14.448327 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 14 00:58:14.448422 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 14 00:58:14.449000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.450294 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 14 00:58:14.450374 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 14 00:58:14.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.451000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:14.451420 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 14 00:58:14.452600 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 14 00:58:14.470709 systemd[1]: Switching root. Jan 14 00:58:14.512626 systemd-journald[342]: Journal stopped Jan 14 00:58:16.412799 systemd-journald[342]: Received SIGTERM from PID 1 (systemd). 
Jan 14 00:58:16.412871 kernel: SELinux: policy capability network_peer_controls=1 Jan 14 00:58:16.412890 kernel: SELinux: policy capability open_perms=1 Jan 14 00:58:16.412901 kernel: SELinux: policy capability extended_socket_class=1 Jan 14 00:58:16.412916 kernel: SELinux: policy capability always_check_network=0 Jan 14 00:58:16.412928 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 14 00:58:16.412941 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 14 00:58:16.412963 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 14 00:58:16.412974 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 14 00:58:16.412985 kernel: SELinux: policy capability userspace_initial_context=0 Jan 14 00:58:16.412997 systemd[1]: Successfully loaded SELinux policy in 55.292ms. Jan 14 00:58:16.413018 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.331ms. Jan 14 00:58:16.413032 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 00:58:16.413044 systemd[1]: Detected virtualization kvm. Jan 14 00:58:16.413456 systemd[1]: Detected architecture x86-64. Jan 14 00:58:16.413475 systemd[1]: Detected first boot. Jan 14 00:58:16.413487 systemd[1]: Hostname set to . Jan 14 00:58:16.413500 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 00:58:16.413511 zram_generator::config[1206]: No configuration found. Jan 14 00:58:16.413528 kernel: Guest personality initialized and is inactive Jan 14 00:58:16.413541 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 14 00:58:16.413556 kernel: Initialized host personality Jan 14 00:58:16.413567 kernel: NET: Registered PF_VSOCK protocol family Jan 14 00:58:16.413579 systemd[1]: Populated /etc with preset unit settings. Jan 14 00:58:16.413591 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 14 00:58:16.413601 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 14 00:58:16.413615 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 14 00:58:16.413634 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 14 00:58:16.413647 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 14 00:58:16.413658 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 14 00:58:16.413670 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 14 00:58:16.413682 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 14 00:58:16.413694 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 14 00:58:16.413707 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 14 00:58:16.413776 systemd[1]: Created slice user.slice - User and Session Slice. Jan 14 00:58:16.413788 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:58:16.413800 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:58:16.413811 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
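"Initializing machine ID from SMBIOS/DMI UUID" above is the first-boot path on KVM: systemd seeds the machine ID from the hypervisor-provided product UUID. The sysfs path and dash-stripping below reflect the commonly documented behaviour and should be read as an approximation of systemd-machine-id-setup, not its exact code:

    # Requires root: /sys/class/dmi/id/product_uuid is usually mode 0400.
    with open("/sys/class/dmi/id/product_uuid") as f:
        uuid = f.read().strip()

    # /etc/machine-id stores 32 lowercase hex characters without dashes.
    machine_id = uuid.replace("-", "").lower()
    print(machine_id)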
Jan 14 00:58:16.413822 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 14 00:58:16.413834 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 14 00:58:16.413849 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 00:58:16.413861 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 14 00:58:16.413872 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:58:16.413883 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:58:16.413894 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 14 00:58:16.413911 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 14 00:58:16.413923 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 14 00:58:16.413934 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 14 00:58:16.413945 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:58:16.413957 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 00:58:16.413968 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 14 00:58:16.413980 systemd[1]: Reached target slices.target - Slice Units. Jan 14 00:58:16.413992 systemd[1]: Reached target swap.target - Swaps. Jan 14 00:58:16.414004 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 14 00:58:16.414015 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 14 00:58:16.414026 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 14 00:58:16.414037 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:58:16.414061 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 14 00:58:16.414074 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:58:16.417422 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 14 00:58:16.417443 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 14 00:58:16.417456 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 00:58:16.417467 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:58:16.417479 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 14 00:58:16.417489 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 14 00:58:16.417500 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 14 00:58:16.417514 systemd[1]: Mounting media.mount - External Media Directory... Jan 14 00:58:16.417525 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:16.417535 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 14 00:58:16.417547 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 14 00:58:16.417557 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 14 00:58:16.417568 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
Jan 14 00:58:16.417579 systemd[1]: Reached target machines.target - Containers. Jan 14 00:58:16.417591 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 14 00:58:16.417603 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:58:16.417613 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 00:58:16.417625 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 14 00:58:16.417636 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 00:58:16.417647 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 00:58:16.417659 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 00:58:16.417671 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 14 00:58:16.417682 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 00:58:16.417693 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 14 00:58:16.417707 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 14 00:58:16.417718 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 14 00:58:16.417729 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 14 00:58:16.419950 systemd[1]: Stopped systemd-fsck-usr.service. Jan 14 00:58:16.419969 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:58:16.419981 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 00:58:16.419992 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 00:58:16.420007 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 00:58:16.420018 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 14 00:58:16.420029 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 14 00:58:16.420040 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 00:58:16.420107 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:16.420119 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 14 00:58:16.420129 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 14 00:58:16.420140 systemd[1]: Mounted media.mount - External Media Directory. Jan 14 00:58:16.420150 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 14 00:58:16.420161 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 14 00:58:16.420171 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 14 00:58:16.420183 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:58:16.420194 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 14 00:58:16.420204 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Jan 14 00:58:16.420216 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 00:58:16.420226 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 00:58:16.420237 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 00:58:16.420247 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 00:58:16.420260 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 00:58:16.420270 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 00:58:16.420280 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 00:58:16.420292 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:58:16.420302 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 00:58:16.420312 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 14 00:58:16.420323 kernel: ACPI: bus type drm_connector registered Jan 14 00:58:16.420337 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 00:58:16.420348 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 00:58:16.420359 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 00:58:16.420369 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 14 00:58:16.420384 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 00:58:16.420402 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 00:58:16.420413 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 14 00:58:16.420423 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 14 00:58:16.420435 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 14 00:58:16.420446 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 00:58:16.420458 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 14 00:58:16.420469 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:58:16.420484 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:58:16.420495 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 14 00:58:16.420507 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 00:58:16.420518 kernel: fuse: init (API version 7.41) Jan 14 00:58:16.420528 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 14 00:58:16.420540 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 14 00:58:16.420573 systemd-journald[1281]: Collecting audit messages is enabled. Jan 14 00:58:16.420596 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 14 00:58:16.420609 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 14 00:58:16.420620 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. 
Jan 14 00:58:16.420631 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:58:16.420642 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 14 00:58:16.420653 systemd-journald[1281]: Journal started Jan 14 00:58:16.420675 systemd-journald[1281]: Runtime Journal (/run/log/journal/48d68fca6aec498b9f16c652f99d0a71) is 8M, max 77.9M, 69.9M free. Jan 14 00:58:16.109000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 14 00:58:16.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.228000 audit: BPF prog-id=14 op=UNLOAD Jan 14 00:58:16.228000 audit: BPF prog-id=13 op=UNLOAD Jan 14 00:58:16.229000 audit: BPF prog-id=15 op=LOAD Jan 14 00:58:16.229000 audit: BPF prog-id=16 op=LOAD Jan 14 00:58:16.229000 audit: BPF prog-id=17 op=LOAD Jan 14 00:58:16.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.293000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.299000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.309000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:16.313000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.317000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.408000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.408000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 14 00:58:16.408000 audit[1281]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffe351e6f60 a2=4000 a3=0 items=0 ppid=1 pid=1281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:58:16.408000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 14 00:58:16.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.419000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.049387 systemd[1]: Queued start job for default target multi-user.target. Jan 14 00:58:16.054833 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 14 00:58:16.055269 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 14 00:58:16.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.425079 systemd[1]: Started systemd-journald.service - Journal Service. 
Jan 14 00:58:16.425000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.429414 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 14 00:58:16.429000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.437069 kernel: loop1: detected capacity change from 0 to 219144 Jan 14 00:58:16.441449 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. Jan 14 00:58:16.441462 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. Jan 14 00:58:16.443320 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 14 00:58:16.446292 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 14 00:58:16.448194 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 14 00:58:16.449208 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 00:58:16.450000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.456173 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 14 00:58:16.469139 systemd-journald[1281]: Time spent on flushing to /var/log/journal/48d68fca6aec498b9f16c652f99d0a71 is 444.968ms for 1856 entries. Jan 14 00:58:16.469139 systemd-journald[1281]: System Journal (/var/log/journal/48d68fca6aec498b9f16c652f99d0a71) is 8M, max 588.1M, 580.1M free. Jan 14 00:58:17.126036 systemd-journald[1281]: Received client request to flush runtime journal. Jan 14 00:58:17.126110 kernel: loop2: detected capacity change from 0 to 111560 Jan 14 00:58:16.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.748000 audit: BPF prog-id=18 op=LOAD Jan 14 00:58:16.748000 audit: BPF prog-id=19 op=LOAD Jan 14 00:58:16.748000 audit: BPF prog-id=20 op=LOAD Jan 14 00:58:16.752000 audit: BPF prog-id=21 op=LOAD Jan 14 00:58:16.768000 audit: BPF prog-id=22 op=LOAD Jan 14 00:58:16.769000 audit: BPF prog-id=23 op=LOAD Jan 14 00:58:16.769000 audit: BPF prog-id=24 op=LOAD Jan 14 00:58:16.772000 audit: BPF prog-id=25 op=LOAD Jan 14 00:58:16.772000 audit: BPF prog-id=26 op=LOAD Jan 14 00:58:16.772000 audit: BPF prog-id=27 op=LOAD Jan 14 00:58:16.791000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:16.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.976000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:16.485301 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:58:16.747185 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 14 00:58:16.750184 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 14 00:58:16.753259 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 00:58:16.755353 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 00:58:16.771178 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 14 00:58:16.773440 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 14 00:58:16.787375 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Jan 14 00:58:16.787386 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Jan 14 00:58:16.789887 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:58:16.836819 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 14 00:58:16.837479 systemd-nsresourced[1354]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 14 00:58:16.838478 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 14 00:58:16.919674 systemd-oomd[1349]: No swap; memory pressure usage will be degraded Jan 14 00:58:16.922157 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 14 00:58:16.935752 systemd-resolved[1350]: Positive Trust Anchors: Jan 14 00:58:16.935761 systemd-resolved[1350]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 00:58:16.935765 systemd-resolved[1350]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 00:58:16.935797 systemd-resolved[1350]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 00:58:17.142000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.142000 audit: BPF prog-id=8 op=UNLOAD Jan 14 00:58:17.142000 audit: BPF prog-id=7 op=UNLOAD Jan 14 00:58:16.974493 systemd-resolved[1350]: Using system hostname 'ci-4547-0-0-n-de0c74fc75'. Jan 14 00:58:16.976200 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 00:58:16.976831 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:58:17.057564 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 14 00:58:17.069517 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 14 00:58:17.126900 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 14 00:58:17.141112 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 14 00:58:17.143000 audit: BPF prog-id=28 op=LOAD Jan 14 00:58:17.143000 audit: BPF prog-id=29 op=LOAD Jan 14 00:58:17.144295 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:58:17.152023 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 14 00:58:17.152124 kernel: loop3: detected capacity change from 0 to 50784 Jan 14 00:58:17.153246 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 14 00:58:17.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.170381 systemd-udevd[1376]: Using default interface naming scheme 'v257'. Jan 14 00:58:17.230076 kernel: loop4: detected capacity change from 0 to 1656 Jan 14 00:58:17.262000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.262091 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 00:58:17.263000 audit: BPF prog-id=30 op=LOAD Jan 14 00:58:17.265169 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 00:58:17.271478 kernel: loop5: detected capacity change from 0 to 219144 Jan 14 00:58:17.325073 kernel: loop6: detected capacity change from 0 to 111560 Jan 14 00:58:17.329038 systemd-networkd[1385]: lo: Link UP Jan 14 00:58:17.329078 systemd-networkd[1385]: lo: Gained carrier Jan 14 00:58:17.329725 systemd[1]: Started systemd-networkd.service - Network Configuration. 
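The systemd-networkd entry above records a DHCPv4 lease of 10.0.21.32/25 with gateway 10.0.21.1 on eth0. As a side illustration (not part of the boot log), Python's standard ipaddress module confirms what that /25 prefix implies for the leased address:

    import ipaddress

    # Address and gateway exactly as logged by systemd-networkd above.
    iface = ipaddress.ip_interface("10.0.21.32/25")
    gateway = ipaddress.ip_address("10.0.21.1")

    print(iface.network)                    # 10.0.21.0/25
    print(iface.network.num_addresses - 2)  # 126 usable host addresses
    print(gateway in iface.network)         # True: the gateway is on-link

Both the leased address and the gateway fall inside 10.0.21.0/25, so the default route points at a directly reachable next hop.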
Jan 14 00:58:17.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.330357 systemd[1]: Reached target network.target - Network. Jan 14 00:58:17.332249 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 14 00:58:17.334131 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 14 00:58:17.353295 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 14 00:58:17.382206 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 14 00:58:17.385657 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 14 00:58:17.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-persistent-storage comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.388084 kernel: ACPI: button: Power Button [PWRF] Jan 14 00:58:17.397392 systemd-networkd[1385]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:58:17.397400 systemd-networkd[1385]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 00:58:17.398677 systemd-networkd[1385]: eth0: Link UP Jan 14 00:58:17.398802 systemd-networkd[1385]: eth0: Gained carrier Jan 14 00:58:17.398818 systemd-networkd[1385]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:58:17.410105 systemd-networkd[1385]: eth0: DHCPv4 address 10.0.21.32/25, gateway 10.0.21.1 acquired from 10.0.21.1 Jan 14 00:58:17.415069 kernel: mousedev: PS/2 mouse device common for all mice Jan 14 00:58:17.445207 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 14 00:58:17.445459 kernel: Console: switching to colour dummy device 80x25 Jan 14 00:58:17.445478 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 14 00:58:17.446192 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 14 00:58:17.446242 kernel: [drm] features: -context_init Jan 14 00:58:17.455072 kernel: [drm] number of scanouts: 1 Jan 14 00:58:17.459540 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 14 00:58:17.459780 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 14 00:58:17.459904 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 14 00:58:17.460029 kernel: [drm] number of cap sets: 0 Jan 14 00:58:17.464065 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 14 00:58:17.473521 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 14 00:58:17.473575 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 00:58:17.478073 kernel: loop7: detected capacity change from 0 to 50784 Jan 14 00:58:17.482070 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 14 00:58:17.559924 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:17.575412 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:58:17.575651 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 14 00:58:17.577000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.577000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.579071 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:17.608652 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:58:17.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.608872 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:17.612253 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:58:17.613066 kernel: loop1: detected capacity change from 0 to 1656 Jan 14 00:58:17.632028 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 00:58:17.635980 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 14 00:58:17.679287 (sd-merge)[1391]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 14 00:58:17.681513 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 14 00:58:17.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.684906 (sd-merge)[1391]: Merged extensions into '/usr'. Jan 14 00:58:17.691325 systemd[1]: Reload requested from client PID 1316 ('systemd-sysext') (unit systemd-sysext.service)... Jan 14 00:58:17.691337 systemd[1]: Reloading... Jan 14 00:58:17.760078 zram_generator::config[1479]: No configuration found. Jan 14 00:58:17.932560 systemd[1]: Reloading finished in 240 ms. Jan 14 00:58:17.949016 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 14 00:58:17.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.949727 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:58:17.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:17.962207 systemd[1]: Starting ensure-sysext.service... Jan 14 00:58:17.963266 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 14 00:58:17.964000 audit: BPF prog-id=31 op=LOAD Jan 14 00:58:17.964000 audit: BPF prog-id=18 op=UNLOAD Jan 14 00:58:17.964000 audit: BPF prog-id=32 op=LOAD Jan 14 00:58:17.964000 audit: BPF prog-id=33 op=LOAD Jan 14 00:58:17.964000 audit: BPF prog-id=19 op=UNLOAD Jan 14 00:58:17.964000 audit: BPF prog-id=20 op=UNLOAD Jan 14 00:58:17.965000 audit: BPF prog-id=34 op=LOAD Jan 14 00:58:17.965000 audit: BPF prog-id=21 op=UNLOAD Jan 14 00:58:17.965000 audit: BPF prog-id=35 op=LOAD Jan 14 00:58:17.965000 audit: BPF prog-id=22 op=UNLOAD Jan 14 00:58:17.965000 audit: BPF prog-id=36 op=LOAD Jan 14 00:58:17.965000 audit: BPF prog-id=37 op=LOAD Jan 14 00:58:17.965000 audit: BPF prog-id=23 op=UNLOAD Jan 14 00:58:17.965000 audit: BPF prog-id=24 op=UNLOAD Jan 14 00:58:17.966000 audit: BPF prog-id=38 op=LOAD Jan 14 00:58:17.966000 audit: BPF prog-id=15 op=UNLOAD Jan 14 00:58:17.966000 audit: BPF prog-id=39 op=LOAD Jan 14 00:58:17.966000 audit: BPF prog-id=40 op=LOAD Jan 14 00:58:17.966000 audit: BPF prog-id=16 op=UNLOAD Jan 14 00:58:17.966000 audit: BPF prog-id=17 op=UNLOAD Jan 14 00:58:17.967000 audit: BPF prog-id=41 op=LOAD Jan 14 00:58:17.970000 audit: BPF prog-id=30 op=UNLOAD Jan 14 00:58:17.971000 audit: BPF prog-id=42 op=LOAD Jan 14 00:58:17.971000 audit: BPF prog-id=43 op=LOAD Jan 14 00:58:17.971000 audit: BPF prog-id=28 op=UNLOAD Jan 14 00:58:17.971000 audit: BPF prog-id=29 op=UNLOAD Jan 14 00:58:17.971000 audit: BPF prog-id=44 op=LOAD Jan 14 00:58:17.971000 audit: BPF prog-id=25 op=UNLOAD Jan 14 00:58:17.971000 audit: BPF prog-id=45 op=LOAD Jan 14 00:58:17.971000 audit: BPF prog-id=46 op=LOAD Jan 14 00:58:17.971000 audit: BPF prog-id=26 op=UNLOAD Jan 14 00:58:17.971000 audit: BPF prog-id=27 op=UNLOAD Jan 14 00:58:17.978337 systemd[1]: Reload requested from client PID 1522 ('systemctl') (unit ensure-sysext.service)... Jan 14 00:58:17.978349 systemd[1]: Reloading... Jan 14 00:58:17.985342 systemd-tmpfiles[1523]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 00:58:17.985367 systemd-tmpfiles[1523]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 14 00:58:17.985560 systemd-tmpfiles[1523]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 00:58:17.986610 systemd-tmpfiles[1523]: ACLs are not supported, ignoring. Jan 14 00:58:17.986673 systemd-tmpfiles[1523]: ACLs are not supported, ignoring. Jan 14 00:58:17.993722 systemd-tmpfiles[1523]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 00:58:17.993730 systemd-tmpfiles[1523]: Skipping /boot Jan 14 00:58:18.005547 systemd-tmpfiles[1523]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 00:58:18.005559 systemd-tmpfiles[1523]: Skipping /boot Jan 14 00:58:18.025079 zram_generator::config[1553]: No configuration found. Jan 14 00:58:18.203901 systemd[1]: Reloading finished in 225 ms. 
Jan 14 00:58:18.218000 audit: BPF prog-id=47 op=LOAD Jan 14 00:58:18.219630 kernel: kauditd_printk_skb: 153 callbacks suppressed Jan 14 00:58:18.219672 kernel: audit: type=1334 audit(1768352298.218:201): prog-id=47 op=LOAD Jan 14 00:58:18.219000 audit: BPF prog-id=38 op=UNLOAD Jan 14 00:58:18.219000 audit: BPF prog-id=48 op=LOAD Jan 14 00:58:18.222805 kernel: audit: type=1334 audit(1768352298.219:202): prog-id=38 op=UNLOAD Jan 14 00:58:18.222857 kernel: audit: type=1334 audit(1768352298.219:203): prog-id=48 op=LOAD Jan 14 00:58:18.219000 audit: BPF prog-id=49 op=LOAD Jan 14 00:58:18.224063 kernel: audit: type=1334 audit(1768352298.219:204): prog-id=49 op=LOAD Jan 14 00:58:18.224097 kernel: audit: type=1334 audit(1768352298.219:205): prog-id=39 op=UNLOAD Jan 14 00:58:18.219000 audit: BPF prog-id=39 op=UNLOAD Jan 14 00:58:18.219000 audit: BPF prog-id=40 op=UNLOAD Jan 14 00:58:18.220000 audit: BPF prog-id=50 op=LOAD Jan 14 00:58:18.226336 kernel: audit: type=1334 audit(1768352298.219:206): prog-id=40 op=UNLOAD Jan 14 00:58:18.226363 kernel: audit: type=1334 audit(1768352298.220:207): prog-id=50 op=LOAD Jan 14 00:58:18.220000 audit: BPF prog-id=34 op=UNLOAD Jan 14 00:58:18.227271 kernel: audit: type=1334 audit(1768352298.220:208): prog-id=34 op=UNLOAD Jan 14 00:58:18.220000 audit: BPF prog-id=51 op=LOAD Jan 14 00:58:18.220000 audit: BPF prog-id=52 op=LOAD Jan 14 00:58:18.231293 kernel: audit: type=1334 audit(1768352298.220:209): prog-id=51 op=LOAD Jan 14 00:58:18.231328 kernel: audit: type=1334 audit(1768352298.220:210): prog-id=52 op=LOAD Jan 14 00:58:18.220000 audit: BPF prog-id=42 op=UNLOAD Jan 14 00:58:18.220000 audit: BPF prog-id=43 op=UNLOAD Jan 14 00:58:18.223000 audit: BPF prog-id=53 op=LOAD Jan 14 00:58:18.223000 audit: BPF prog-id=35 op=UNLOAD Jan 14 00:58:18.223000 audit: BPF prog-id=54 op=LOAD Jan 14 00:58:18.227000 audit: BPF prog-id=55 op=LOAD Jan 14 00:58:18.227000 audit: BPF prog-id=36 op=UNLOAD Jan 14 00:58:18.227000 audit: BPF prog-id=37 op=UNLOAD Jan 14 00:58:18.228000 audit: BPF prog-id=56 op=LOAD Jan 14 00:58:18.228000 audit: BPF prog-id=41 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=57 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=31 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=58 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=59 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=32 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=33 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=60 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=44 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=61 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=62 op=LOAD Jan 14 00:58:18.229000 audit: BPF prog-id=45 op=UNLOAD Jan 14 00:58:18.229000 audit: BPF prog-id=46 op=UNLOAD Jan 14 00:58:18.233721 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:58:18.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.242627 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 00:58:18.246567 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 14 00:58:18.251538 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 14 00:58:18.254244 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
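The kauditd records above carry two time formats: the syslog-style wall-clock prefix (Jan 14 00:58:18.xxx) and the kernel's own audit(<epoch>.<millis>:<serial>) header. A small sketch (illustrative only; it assumes the console clock is UTC, which the exact match suggests) shows the two agree for the first record:

    import re
    from datetime import datetime, timezone

    # Audit header copied from the kernel line above.
    record = "audit(1768352298.218:201): prog-id=47 op=LOAD"

    m = re.search(r"audit\((\d+)\.(\d+):(\d+)\)", record)
    epoch, millis, serial = (int(g) for g in m.groups())

    stamp = datetime.fromtimestamp(epoch, tz=timezone.utc)
    print(f"{stamp:%b %d %H:%M:%S}.{millis:03d}  serial={serial}")
    # -> Jan 14 00:58:18.218  serial=201, matching the wall-clock prefix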
Jan 14 00:58:18.257320 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 14 00:58:18.262249 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.262395 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:58:18.263678 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 00:58:18.267244 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 00:58:18.268626 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 00:58:18.270331 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:58:18.270523 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:58:18.270605 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:58:18.270682 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.275150 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.275289 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:58:18.275424 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:58:18.275537 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:58:18.275603 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:58:18.275672 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.281171 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.281364 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:58:18.283293 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 00:58:18.291945 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 14 00:58:18.293377 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:58:18.293546 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 14 00:58:18.293631 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:58:18.293765 systemd[1]: Reached target time-set.target - System Time Set. Jan 14 00:58:18.295627 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 00:58:18.297678 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 00:58:18.298699 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 00:58:18.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.300000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.300692 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 00:58:18.301728 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 00:58:18.304663 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 00:58:18.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.305394 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 00:58:18.307000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.313120 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 00:58:18.316634 systemd[1]: Finished ensure-sysext.service. Jan 14 00:58:18.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.319000 audit[1605]: SYSTEM_BOOT pid=1605 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.328000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:58:18.327089 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 14 00:58:18.335632 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 00:58:18.335830 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 00:58:18.347752 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 14 00:58:18.347810 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 14 00:58:18.343000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.344713 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 00:58:18.353066 kernel: PTP clock support registered Jan 14 00:58:18.357758 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 14 00:58:18.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:58:18.358506 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 14 00:58:18.359212 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 14 00:58:18.359000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 14 00:58:18.359000 audit[1636]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff10b92320 a2=420 a3=0 items=0 ppid=1601 pid=1636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:58:18.359000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 00:58:18.359909 augenrules[1636]: No rules Jan 14 00:58:18.363320 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 00:58:18.363530 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 00:58:18.437709 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 14 00:58:18.438761 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 14 00:58:18.774089 systemd-networkd[1385]: eth0: Gained IPv6LL Jan 14 00:58:18.776264 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 14 00:58:18.777867 systemd[1]: Reached target network-online.target - Network is Online. Jan 14 00:58:19.002346 ldconfig[1603]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 14 00:58:19.008810 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 14 00:58:19.011173 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
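The PROCTITLE record above, emitted while augenrules/auditctl loaded the (empty) rule set, stores the process command line hex-encoded with NUL separators between arguments. Decoding it is a one-liner; this is an aside, not part of the log:

    # proctitle value copied verbatim from the audit record above.
    proctitle = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"

    argv = bytes.fromhex(proctitle).split(b"\x00")
    print([a.decode() for a in argv])
    # -> ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']

This is consistent with the auditctl SYSCALL record logged next to it.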
Jan 14 00:58:19.027760 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 14 00:58:19.028841 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 00:58:19.029978 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 14 00:58:19.030326 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 14 00:58:19.030647 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 14 00:58:19.031100 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 14 00:58:19.031481 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 14 00:58:19.031807 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 14 00:58:19.033867 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 14 00:58:19.034527 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 14 00:58:19.034846 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 14 00:58:19.034872 systemd[1]: Reached target paths.target - Path Units. Jan 14 00:58:19.035176 systemd[1]: Reached target timers.target - Timer Units. Jan 14 00:58:19.036871 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 14 00:58:19.038340 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 14 00:58:19.042543 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 14 00:58:19.044604 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 14 00:58:19.044933 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 14 00:58:19.055673 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 14 00:58:19.059538 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 14 00:58:19.060598 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 14 00:58:19.062973 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 00:58:19.064349 systemd[1]: Reached target basic.target - Basic System. Jan 14 00:58:19.064706 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 14 00:58:19.064730 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 14 00:58:19.067848 systemd[1]: Starting chronyd.service - NTP client/server... Jan 14 00:58:19.072162 systemd[1]: Starting containerd.service - containerd container runtime... Jan 14 00:58:19.078407 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 14 00:58:19.081216 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 14 00:58:19.084177 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 14 00:58:19.087218 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 14 00:58:19.092919 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 14 00:58:19.093676 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). 
Jan 14 00:58:19.097163 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 14 00:58:19.101142 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:58:19.103287 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 14 00:58:19.111421 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:19.112318 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 14 00:58:19.117105 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Refreshing passwd entry cache Jan 14 00:58:19.116621 oslogin_cache_refresh[1658]: Refreshing passwd entry cache Jan 14 00:58:19.123133 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 14 00:58:19.126735 oslogin_cache_refresh[1658]: Failure getting users, quitting Jan 14 00:58:19.127240 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Failure getting users, quitting Jan 14 00:58:19.127240 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 14 00:58:19.127240 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Refreshing group entry cache Jan 14 00:58:19.126751 oslogin_cache_refresh[1658]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 14 00:58:19.126793 oslogin_cache_refresh[1658]: Refreshing group entry cache Jan 14 00:58:19.128950 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 14 00:58:19.136661 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Failure getting groups, quitting Jan 14 00:58:19.136661 google_oslogin_nss_cache[1658]: oslogin_cache_refresh[1658]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 14 00:58:19.134560 oslogin_cache_refresh[1658]: Failure getting groups, quitting Jan 14 00:58:19.134572 oslogin_cache_refresh[1658]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 14 00:58:19.141221 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 14 00:58:19.145769 jq[1656]: false Jan 14 00:58:19.145250 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 14 00:58:19.146757 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 14 00:58:19.148306 chronyd[1651]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 14 00:58:19.151283 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 14 00:58:19.155234 systemd[1]: Starting update-engine.service - Update Engine... Jan 14 00:58:19.158356 chronyd[1651]: Loaded seccomp filter (level 2) Jan 14 00:58:19.159361 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 14 00:58:19.161454 systemd[1]: Started chronyd.service - NTP client/server. Jan 14 00:58:19.169483 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 14 00:58:19.170182 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 14 00:58:19.170433 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 14 00:58:19.170659 systemd[1]: google-oslogin-cache.service: Deactivated successfully. 
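The chronyd start-up line above lists its compile-time features as +/- flags. Purely as an illustration, splitting that string shows what this build enables and what it lacks:

    # Feature string copied from the chronyd 4.8 start-up message above.
    features = "+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG"

    enabled  = [f[1:] for f in features.split() if f[0] == "+"]
    disabled = [f[1:] for f in features.split() if f[0] == "-"]
    print("enabled: ", enabled)   # CMDMON, REFCLOCK, RTC, PRIVDROP, SCFILTER, NTS, SECHASH, IPV6
    print("disabled:", disabled)  # SIGND, DEBUG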
Jan 14 00:58:19.170827 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 14 00:58:19.175212 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 14 00:58:19.175414 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 14 00:58:19.182593 extend-filesystems[1657]: Found /dev/vda6 Jan 14 00:58:19.196443 extend-filesystems[1657]: Found /dev/vda9 Jan 14 00:58:19.213749 extend-filesystems[1657]: Checking size of /dev/vda9 Jan 14 00:58:19.220869 jq[1671]: true Jan 14 00:58:19.257100 update_engine[1668]: I20260114 00:58:19.252255 1668 main.cc:92] Flatcar Update Engine starting Jan 14 00:58:19.254778 systemd[1]: motdgen.service: Deactivated successfully. Jan 14 00:58:19.258123 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 14 00:58:19.266070 tar[1683]: linux-amd64/LICENSE Jan 14 00:58:19.267491 tar[1683]: linux-amd64/helm Jan 14 00:58:19.276842 extend-filesystems[1657]: Resized partition /dev/vda9 Jan 14 00:58:19.282463 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 14 00:58:19.289782 jq[1709]: true Jan 14 00:58:19.290989 extend-filesystems[1716]: resize2fs 1.47.3 (8-Jul-2025) Jan 14 00:58:19.314149 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 14 00:58:19.316546 dbus-daemon[1654]: [system] SELinux support is enabled Jan 14 00:58:19.316929 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 14 00:58:19.332497 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 14 00:58:19.333014 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 14 00:58:19.334005 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 14 00:58:19.334019 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 14 00:58:19.349891 systemd[1]: Started update-engine.service - Update Engine. Jan 14 00:58:19.351919 update_engine[1668]: I20260114 00:58:19.351824 1668 update_check_scheduler.cc:74] Next update check in 6m41s Jan 14 00:58:19.355126 systemd-logind[1667]: New seat seat0. Jan 14 00:58:19.357361 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 14 00:58:19.360875 systemd-logind[1667]: Watching system buttons on /dev/input/event3 (Power Button) Jan 14 00:58:19.360897 systemd-logind[1667]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 14 00:58:19.361160 systemd[1]: Started systemd-logind.service - User Login Management. Jan 14 00:58:19.486153 locksmithd[1732]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 14 00:58:19.531863 bash[1736]: Updated "/home/core/.ssh/authorized_keys" Jan 14 00:58:19.533056 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
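The extend-filesystems/resize2fs entries above grow /dev/vda9 from 1617920 to 11516923 blocks. As a rough aside (the 4 KiB block size is an assumption typical for ext4 root filesystems; the log does not state it), that corresponds to roughly 6.2 GiB growing to about 44 GiB:

    # Block counts from the EXT4-fs resize message above; BLOCK_SIZE is assumed.
    BLOCK_SIZE = 4096
    old_blocks, new_blocks = 1_617_920, 11_516_923

    to_gib = lambda blocks: blocks * BLOCK_SIZE / 2**30
    print(f"{to_gib(old_blocks):.1f} GiB -> {to_gib(new_blocks):.1f} GiB")
    # -> 6.2 GiB -> 43.9 GiB (assuming 4 KiB blocks)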
Jan 14 00:58:19.535875 containerd[1695]: time="2026-01-14T00:58:19Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 14 00:58:19.539981 containerd[1695]: time="2026-01-14T00:58:19.539476047Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 14 00:58:19.540121 sshd_keygen[1705]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 14 00:58:19.542162 systemd[1]: Starting sshkeys.service... Jan 14 00:58:19.564202 containerd[1695]: time="2026-01-14T00:58:19.564162298Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.389µs" Jan 14 00:58:19.565637 containerd[1695]: time="2026-01-14T00:58:19.565604788Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 14 00:58:19.565691 containerd[1695]: time="2026-01-14T00:58:19.565662001Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 14 00:58:19.565691 containerd[1695]: time="2026-01-14T00:58:19.565675051Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 14 00:58:19.565814 containerd[1695]: time="2026-01-14T00:58:19.565799173Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 14 00:58:19.565853 containerd[1695]: time="2026-01-14T00:58:19.565822345Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566206 containerd[1695]: time="2026-01-14T00:58:19.565869384Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566206 containerd[1695]: time="2026-01-14T00:58:19.565891681Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566334699Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566352762Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566362335Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566380338Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566511521Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566520881Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 
containerd[1695]: time="2026-01-14T00:58:19.566575115Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566712319Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566734883Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566742820Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 14 00:58:19.566953 containerd[1695]: time="2026-01-14T00:58:19.566761660Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 14 00:58:19.567185 containerd[1695]: time="2026-01-14T00:58:19.566929961Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 14 00:58:19.567185 containerd[1695]: time="2026-01-14T00:58:19.566971602Z" level=info msg="metadata content store policy set" policy=shared Jan 14 00:58:19.574957 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 14 00:58:19.578942 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 14 00:58:19.591199 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 14 00:58:19.595070 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:19.596397 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 14 00:58:19.612295 systemd[1]: issuegen.service: Deactivated successfully. Jan 14 00:58:19.612685 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 14 00:58:19.615527 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
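The snapshotter probes above show containerd skipping every backend whose prerequisite is missing: btrfs on an ext4 root, devmapper with no configuration, erofs without the kernel module, zfs without a dataset directory. The check behind the btrfs skip boils down to resolving which filesystem backs the snapshotter's state directory; a rough Python sketch of that lookup follows, assuming the default /var/lib/containerd layout (fstype_for is an illustrative helper, not containerd's own code, which does this natively in Go).

    import os

    def fstype_for(path, mounts="/proc/self/mounts"):
        """Return the filesystem type of the mount point that contains `path`."""
        path = os.path.realpath(path)
        best, best_fs = "", "unknown"
        with open(mounts) as f:
            for line in f:
                _dev, mnt, fstype, *_rest = line.split()
                # keep the longest mount point that is a prefix of `path`
                if (path == mnt or path.startswith(mnt.rstrip("/") + "/")) and len(mnt) > len(best):
                    best, best_fs = mnt, fstype
        return best_fs

    snap_dir = "/var/lib/containerd/io.containerd.snapshotter.v1.btrfs"
    target = snap_dir if os.path.exists(snap_dir) else "/var/lib/containerd"
    print(f"{target}: {fstype_for(target)}")   # on this host: ext4, so the btrfs snapshotter is skipped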
Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619329462Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619486300Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619767382Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619782300Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619793579Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619805690Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619832237Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619843326Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619859270Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.619878545Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.620070321Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.620082269Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.620092875Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 14 00:58:19.620358 containerd[1695]: time="2026-01-14T00:58:19.620104156Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621595822Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621627612Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621660610Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621673064Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621702727Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: 
time="2026-01-14T00:58:19.621711835Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621732553Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621744144Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621754587Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621764763Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621774419Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621805899Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621843069Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621854420Z" level=info msg="Start snapshots syncer" Jan 14 00:58:19.622254 containerd[1695]: time="2026-01-14T00:58:19.621884239Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 14 00:58:19.622528 containerd[1695]: time="2026-01-14T00:58:19.622170327Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 14 
00:58:19.622528 containerd[1695]: time="2026-01-14T00:58:19.622286992Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622337774Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622443540Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622461523Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622470419Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622490840Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622516899Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622527896Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622537263Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622545515Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622565022Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622592687Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 00:58:19.622640 containerd[1695]: time="2026-01-14T00:58:19.622605183Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622646458Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622655539Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622662389Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622672642Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622682419Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 14 00:58:19.622827 containerd[1695]: time="2026-01-14T00:58:19.622693143Z" level=info msg="runtime interface created" Jan 14 00:58:19.622827 containerd[1695]: 
time="2026-01-14T00:58:19.622697514Z" level=info msg="created NRI interface" Jan 14 00:58:19.624079 containerd[1695]: time="2026-01-14T00:58:19.623003750Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 14 00:58:19.624079 containerd[1695]: time="2026-01-14T00:58:19.623025159Z" level=info msg="Connect containerd service" Jan 14 00:58:19.624079 containerd[1695]: time="2026-01-14T00:58:19.623045437Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 14 00:58:19.624952 containerd[1695]: time="2026-01-14T00:58:19.624396941Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 00:58:19.646031 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 14 00:58:19.653013 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 14 00:58:19.657191 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 14 00:58:19.658629 systemd[1]: Reached target getty.target - Login Prompts. Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768134065Z" level=info msg="Start subscribing containerd event" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768177626Z" level=info msg="Start recovering state" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768266613Z" level=info msg="Start event monitor" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768277061Z" level=info msg="Start cni network conf syncer for default" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768282932Z" level=info msg="Start streaming server" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768290489Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768296495Z" level=info msg="runtime interface starting up..." Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768301651Z" level=info msg="starting plugins..." Jan 14 00:58:19.768840 containerd[1695]: time="2026-01-14T00:58:19.768312104Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 14 00:58:19.770278 containerd[1695]: time="2026-01-14T00:58:19.770156995Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 14 00:58:19.770278 containerd[1695]: time="2026-01-14T00:58:19.770197195Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 14 00:58:19.770398 systemd[1]: Started containerd.service - containerd container runtime. Jan 14 00:58:19.771698 containerd[1695]: time="2026-01-14T00:58:19.771673132Z" level=info msg="containerd successfully booted in 0.236126s" Jan 14 00:58:19.933608 tar[1683]: linux-amd64/README.md Jan 14 00:58:19.949381 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 14 00:58:20.015119 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 14 00:58:20.152990 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:20.179655 extend-filesystems[1716]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 14 00:58:20.179655 extend-filesystems[1716]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 14 00:58:20.179655 extend-filesystems[1716]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. 
Jan 14 00:58:20.182009 extend-filesystems[1657]: Resized filesystem in /dev/vda9 Jan 14 00:58:20.180396 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 14 00:58:20.180642 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 14 00:58:20.605076 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:20.912541 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:58:20.922332 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:58:21.563561 kubelet[1792]: E0114 00:58:21.563516 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:58:21.565612 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:58:21.565923 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:58:21.566398 systemd[1]: kubelet.service: Consumed 872ms CPU time, 257.9M memory peak. Jan 14 00:58:22.162091 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:22.616080 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:26.172085 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:26.179095 coreos-metadata[1653]: Jan 14 00:58:26.178 WARN failed to locate config-drive, using the metadata service API instead Jan 14 00:58:26.195159 coreos-metadata[1653]: Jan 14 00:58:26.195 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 14 00:58:26.474327 coreos-metadata[1653]: Jan 14 00:58:26.474 INFO Fetch successful Jan 14 00:58:26.474536 coreos-metadata[1653]: Jan 14 00:58:26.474 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 00:58:26.609210 coreos-metadata[1653]: Jan 14 00:58:26.609 INFO Fetch successful Jan 14 00:58:26.609210 coreos-metadata[1653]: Jan 14 00:58:26.609 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 14 00:58:26.629094 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 00:58:26.638396 coreos-metadata[1755]: Jan 14 00:58:26.638 WARN failed to locate config-drive, using the metadata service API instead Jan 14 00:58:26.650159 coreos-metadata[1755]: Jan 14 00:58:26.650 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 14 00:58:26.862199 coreos-metadata[1653]: Jan 14 00:58:26.862 INFO Fetch successful Jan 14 00:58:26.862199 coreos-metadata[1653]: Jan 14 00:58:26.862 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 14 00:58:26.864523 coreos-metadata[1755]: Jan 14 00:58:26.864 INFO Fetch successful Jan 14 00:58:26.864523 coreos-metadata[1755]: Jan 14 00:58:26.864 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 14 00:58:27.077496 coreos-metadata[1653]: Jan 14 00:58:27.077 INFO Fetch successful Jan 14 00:58:27.077496 coreos-metadata[1653]: Jan 14 00:58:27.077 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 14 00:58:27.190211 coreos-metadata[1755]: Jan 14 00:58:27.190 INFO Fetch successful Jan 14 00:58:27.192542 unknown[1755]: wrote ssh authorized keys file for user: core Jan 14 00:58:27.198152 
coreos-metadata[1653]: Jan 14 00:58:27.198 INFO Fetch successful Jan 14 00:58:27.198428 coreos-metadata[1653]: Jan 14 00:58:27.198 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 14 00:58:27.219212 update-ssh-keys[1810]: Updated "/home/core/.ssh/authorized_keys" Jan 14 00:58:27.220358 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 14 00:58:27.223590 systemd[1]: Finished sshkeys.service. Jan 14 00:58:27.623153 coreos-metadata[1653]: Jan 14 00:58:27.623 INFO Fetch successful Jan 14 00:58:27.653974 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 14 00:58:27.654556 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 14 00:58:27.654679 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 14 00:58:27.654807 systemd[1]: Startup finished in 3.555s (kernel) + 12.728s (initrd) + 12.339s (userspace) = 28.623s. Jan 14 00:58:31.584320 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 14 00:58:31.585840 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:58:31.709292 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:58:31.715347 (kubelet)[1827]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:58:32.164635 kubelet[1827]: E0114 00:58:32.164593 1827 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:58:32.167549 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:58:32.167657 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:58:32.167954 systemd[1]: kubelet.service: Consumed 137ms CPU time, 110.2M memory peak. Jan 14 00:58:42.334269 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 14 00:58:42.335653 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:58:42.461137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:58:42.467340 (kubelet)[1842]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:58:42.935479 kubelet[1842]: E0114 00:58:42.712463 1842 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:58:42.714273 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:58:42.714425 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:58:42.714951 systemd[1]: kubelet.service: Consumed 127ms CPU time, 110.2M memory peak. Jan 14 00:58:42.940708 chronyd[1651]: Selected source PHC0 Jan 14 00:58:52.834655 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 14 00:58:52.836031 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
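With no config-2 drive present, the coreos-metadata units above fall back to the link-local HTTP metadata API and retry each endpoint until it answers ("Attempt #1", "Fetch successful"). A minimal sketch of that probe, using only URLs that appear in the log; the retry count and timeout are assumptions, and the real agent is a compiled binary with its own backoff logic, so this is illustrative only:

    import time
    import urllib.request

    ENDPOINTS = [
        "http://169.254.169.254/openstack/2012-08-10/meta_data.json",
        "http://169.254.169.254/latest/meta-data/hostname",
        "http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key",
    ]

    def fetch(url, attempts=3, timeout=5):
        """Fetch a metadata URL, retrying a few times like the agent's 'Attempt #N' lines."""
        for attempt in range(1, attempts + 1):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read().decode()
            except OSError as err:
                print(f"attempt #{attempt} for {url} failed: {err}")
                time.sleep(1)
        return None

    if __name__ == "__main__":
        for url in ENDPOINTS:
            body = fetch(url)
            print(url, "->", "fetch successful" if body is not None else "gave up")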
Jan 14 00:58:52.975110 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:58:52.985489 (kubelet)[1855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:58:53.436950 kubelet[1855]: E0114 00:58:53.436898 1855 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:58:53.439082 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:58:53.439263 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:58:53.439856 systemd[1]: kubelet.service: Consumed 128ms CPU time, 108.3M memory peak. Jan 14 00:59:03.584543 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 14 00:59:03.586206 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:59:03.957140 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:59:03.960561 (kubelet)[1871]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:03.989613 kubelet[1871]: E0114 00:59:03.989572 1871 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:03.991087 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:03.991206 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:03.991518 systemd[1]: kubelet.service: Consumed 129ms CPU time, 109.8M memory peak. Jan 14 00:59:05.076563 update_engine[1668]: I20260114 00:59:05.076109 1668 update_attempter.cc:509] Updating boot flags... Jan 14 00:59:14.084247 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 14 00:59:14.085911 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:59:14.268197 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:59:14.274367 (kubelet)[1903]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:14.518695 kubelet[1903]: E0114 00:59:14.518600 1903 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:14.520562 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:14.520763 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:14.521171 systemd[1]: kubelet.service: Consumed 125ms CPU time, 109.5M memory peak. Jan 14 00:59:24.584244 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jan 14 00:59:24.586202 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
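Each kubelet attempt above dies for the same reason: /var/lib/kubelet/config.yaml does not exist yet (it is normally written later by kubeadm or other node provisioning), so systemd just keeps scheduling restarts. The spacing of those restarts can be read straight off the "Scheduled restart job" timestamps; the sketch below computes it for counters 3 and 4, and the ~10.75 s gap is consistent with a unit restarting after roughly ten seconds, which is an assumption about the unit file rather than something stated in the log:

    from datetime import datetime

    FMT = "%b %d %H:%M:%S.%f"   # journal timestamps, year omitted

    # "Scheduled restart job" timestamps for restart counters 3 and 4, taken from the log
    t3 = datetime.strptime("Jan 14 00:58:52.834655", FMT)
    t4 = datetime.strptime("Jan 14 00:59:03.584543", FMT)

    print(f"gap between restarts: {(t4 - t3).total_seconds():.2f} s")   # ~10.75 s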
Jan 14 00:59:24.808573 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:59:24.819473 (kubelet)[1918]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:24.852767 kubelet[1918]: E0114 00:59:24.852677 1918 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:24.854748 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:24.854969 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:24.855532 systemd[1]: kubelet.service: Consumed 132ms CPU time, 109.9M memory peak. Jan 14 00:59:35.084632 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jan 14 00:59:35.086397 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:59:35.461768 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:59:35.471435 (kubelet)[1933]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:35.506577 kubelet[1933]: E0114 00:59:35.506528 1933 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:35.509152 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:35.509285 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:35.509610 systemd[1]: kubelet.service: Consumed 145ms CPU time, 109.9M memory peak. Jan 14 00:59:45.584174 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jan 14 00:59:45.586204 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:59:45.802194 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:59:45.808397 (kubelet)[1947]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:45.845608 kubelet[1947]: E0114 00:59:45.845495 1947 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:45.847365 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:45.847492 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:45.848090 systemd[1]: kubelet.service: Consumed 144ms CPU time, 108.4M memory peak. Jan 14 00:59:56.084381 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jan 14 00:59:56.086613 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:59:56.252596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 00:59:56.261412 (kubelet)[1962]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:59:56.376682 kubelet[1962]: E0114 00:59:56.376596 1962 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:59:56.378471 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:59:56.378576 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:59:56.379179 systemd[1]: kubelet.service: Consumed 208ms CPU time, 108.4M memory peak. Jan 14 01:00:06.584352 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Jan 14 01:00:06.586172 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:06.795262 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:00:06.806416 (kubelet)[1978]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:06.838111 kubelet[1978]: E0114 01:00:06.838015 1978 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:06.839569 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:06.839679 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:06.840154 systemd[1]: kubelet.service: Consumed 129ms CPU time, 110M memory peak. Jan 14 01:00:17.084478 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Jan 14 01:00:17.086405 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:17.443905 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:00:17.449356 (kubelet)[1993]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:17.480679 kubelet[1993]: E0114 01:00:17.480642 1993 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:17.482647 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:17.482824 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:17.483336 systemd[1]: kubelet.service: Consumed 131ms CPU time, 110M memory peak. Jan 14 01:00:27.584191 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Jan 14 01:00:27.585831 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:27.864022 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:00:27.879309 (kubelet)[2008]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:27.911532 kubelet[2008]: E0114 01:00:27.911489 2008 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:27.913560 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:27.913685 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:27.914121 systemd[1]: kubelet.service: Consumed 130ms CPU time, 110M memory peak. Jan 14 01:00:38.084281 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. Jan 14 01:00:38.085742 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:38.449172 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:00:38.459440 (kubelet)[2022]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:38.493991 kubelet[2022]: E0114 01:00:38.493908 2022 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:38.495644 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:38.495768 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:38.496104 systemd[1]: kubelet.service: Consumed 134ms CPU time, 109.4M memory peak. Jan 14 01:00:48.584402 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14. Jan 14 01:00:48.586136 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:48.860403 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:00:48.866371 (kubelet)[2036]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:48.899975 kubelet[2036]: E0114 01:00:48.899926 2036 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:48.902000 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:48.902257 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:48.902758 systemd[1]: kubelet.service: Consumed 131ms CPU time, 109.7M memory peak. Jan 14 01:00:59.084291 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 15. Jan 14 01:00:59.087227 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:00:59.441196 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:00:59.452415 (kubelet)[2051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:00:59.482824 kubelet[2051]: E0114 01:00:59.482786 2051 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:00:59.484686 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:00:59.484798 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:00:59.485268 systemd[1]: kubelet.service: Consumed 128ms CPU time, 108.6M memory peak. Jan 14 01:01:09.584272 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 16. Jan 14 01:01:09.586156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:01:09.772529 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:01:09.778389 (kubelet)[2066]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:01:09.809627 kubelet[2066]: E0114 01:01:09.809578 2066 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:01:09.811600 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:01:09.811722 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:01:09.812074 systemd[1]: kubelet.service: Consumed 127ms CPU time, 108M memory peak. Jan 14 01:01:19.834817 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 17. Jan 14 01:01:19.836502 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:01:20.211793 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:01:20.224431 (kubelet)[2081]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:01:20.258494 kubelet[2081]: E0114 01:01:20.258418 2081 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:01:20.260434 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:01:20.260576 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:01:20.261093 systemd[1]: kubelet.service: Consumed 140ms CPU time, 108.2M memory peak. Jan 14 01:01:30.334297 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 18. Jan 14 01:01:30.336117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:01:30.616199 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:01:30.619744 (kubelet)[2096]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:01:30.652385 kubelet[2096]: E0114 01:01:30.652349 2096 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:01:30.653890 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:01:30.654131 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:01:30.654575 systemd[1]: kubelet.service: Consumed 129ms CPU time, 108.2M memory peak. Jan 14 01:01:40.834264 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 19. Jan 14 01:01:40.836171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:01:41.213872 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:01:41.223437 (kubelet)[2111]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:01:41.254669 kubelet[2111]: E0114 01:01:41.254617 2111 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:01:41.256576 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:01:41.256790 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:01:41.257371 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.9M memory peak. Jan 14 01:01:51.334449 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 20. Jan 14 01:01:51.336463 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:01:51.619470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:01:51.633530 (kubelet)[2126]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:01:51.669512 kubelet[2126]: E0114 01:01:51.669476 2126 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:01:51.671709 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:01:51.671834 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:01:51.672345 systemd[1]: kubelet.service: Consumed 134ms CPU time, 109.6M memory peak. Jan 14 01:02:01.834223 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 21. Jan 14 01:02:01.837230 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:02.103424 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:02:02.112312 (kubelet)[2141]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:02.143606 kubelet[2141]: E0114 01:02:02.143561 2141 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:02.145544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:02.145665 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:02.146300 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.7M memory peak. Jan 14 01:02:12.334407 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 22. Jan 14 01:02:12.336165 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:12.529167 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:02:12.537413 (kubelet)[2155]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:12.570080 kubelet[2155]: E0114 01:02:12.570022 2155 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:12.571521 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:12.571640 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:12.572143 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.9M memory peak. Jan 14 01:02:22.584289 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 23. Jan 14 01:02:22.585951 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:22.950638 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:02:22.960303 (kubelet)[2170]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:22.994029 kubelet[2170]: E0114 01:02:22.993950 2170 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:22.996035 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:22.996269 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:22.996710 systemd[1]: kubelet.service: Consumed 135ms CPU time, 108.2M memory peak. Jan 14 01:02:33.084426 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 24. Jan 14 01:02:33.086200 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:33.321069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:02:33.334292 (kubelet)[2183]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:33.364674 kubelet[2183]: E0114 01:02:33.364586 2183 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:33.366478 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:33.366667 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:33.367151 systemd[1]: kubelet.service: Consumed 125ms CPU time, 110.2M memory peak. Jan 14 01:02:43.584180 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 25. Jan 14 01:02:43.585825 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:43.946092 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:02:43.959357 (kubelet)[2199]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:43.993731 kubelet[2199]: E0114 01:02:43.993696 2199 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:43.995623 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:43.995748 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:43.996077 systemd[1]: kubelet.service: Consumed 130ms CPU time, 110.1M memory peak. Jan 14 01:02:54.084190 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 26. Jan 14 01:02:54.085543 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:02:54.318218 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:02:54.325339 (kubelet)[2213]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:02:54.356236 kubelet[2213]: E0114 01:02:54.356143 2213 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:02:54.358089 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:02:54.358292 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:02:54.358786 systemd[1]: kubelet.service: Consumed 127ms CPU time, 110.1M memory peak. Jan 14 01:03:04.584385 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 27. Jan 14 01:03:04.586601 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:04.862526 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:03:04.872390 (kubelet)[2227]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:03:04.905725 kubelet[2227]: E0114 01:03:04.905679 2227 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:03:04.907539 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:03:04.907732 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:03:04.908234 systemd[1]: kubelet.service: Consumed 131ms CPU time, 109.9M memory peak. Jan 14 01:03:15.084229 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 28. Jan 14 01:03:15.086042 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:15.304000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:15.314423 (kubelet)[2242]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:03:15.360910 kubelet[2242]: E0114 01:03:15.360832 2242 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:03:15.362799 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:03:15.362905 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:03:15.363425 systemd[1]: kubelet.service: Consumed 130ms CPU time, 110M memory peak. Jan 14 01:03:21.991179 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 14 01:03:21.992678 systemd[1]: Started sshd@0-10.0.21.32:22-4.153.228.146:42470.service - OpenSSH per-connection server daemon (4.153.228.146:42470). Jan 14 01:03:22.561744 sshd[2259]: Accepted publickey for core from 4.153.228.146 port 42470 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:22.563193 sshd-session[2259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:22.571398 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 14 01:03:22.573701 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 14 01:03:22.575572 systemd-logind[1667]: New session 1 of user core. Jan 14 01:03:22.602102 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 14 01:03:22.604240 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 14 01:03:22.622672 (systemd)[2265]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:22.625018 systemd-logind[1667]: New session 2 of user core. Jan 14 01:03:22.746191 systemd[2265]: Queued start job for default target default.target. Jan 14 01:03:22.754108 systemd[2265]: Created slice app.slice - User Application Slice. Jan 14 01:03:22.754138 systemd[2265]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 14 01:03:22.754150 systemd[2265]: Reached target paths.target - Paths. 
Jan 14 01:03:22.754191 systemd[2265]: Reached target timers.target - Timers. Jan 14 01:03:22.755184 systemd[2265]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 14 01:03:22.755740 systemd[2265]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 14 01:03:22.765568 systemd[2265]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 14 01:03:22.765644 systemd[2265]: Reached target sockets.target - Sockets. Jan 14 01:03:22.767768 systemd[2265]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 14 01:03:22.767846 systemd[2265]: Reached target basic.target - Basic System. Jan 14 01:03:22.767886 systemd[2265]: Reached target default.target - Main User Target. Jan 14 01:03:22.767909 systemd[2265]: Startup finished in 138ms. Jan 14 01:03:22.768145 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 14 01:03:22.776742 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 14 01:03:23.082295 systemd[1]: Started sshd@1-10.0.21.32:22-4.153.228.146:42486.service - OpenSSH per-connection server daemon (4.153.228.146:42486). Jan 14 01:03:23.620082 sshd[2279]: Accepted publickey for core from 4.153.228.146 port 42486 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:23.621159 sshd-session[2279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:23.625028 systemd-logind[1667]: New session 3 of user core. Jan 14 01:03:23.633211 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 14 01:03:23.923157 sshd[2283]: Connection closed by 4.153.228.146 port 42486 Jan 14 01:03:23.924835 sshd-session[2279]: pam_unix(sshd:session): session closed for user core Jan 14 01:03:23.928190 systemd[1]: sshd@1-10.0.21.32:22-4.153.228.146:42486.service: Deactivated successfully. Jan 14 01:03:23.930107 systemd[1]: session-3.scope: Deactivated successfully. Jan 14 01:03:23.931426 systemd-logind[1667]: Session 3 logged out. Waiting for processes to exit. Jan 14 01:03:23.932424 systemd-logind[1667]: Removed session 3. Jan 14 01:03:24.030184 systemd[1]: Started sshd@2-10.0.21.32:22-4.153.228.146:42498.service - OpenSSH per-connection server daemon (4.153.228.146:42498). Jan 14 01:03:24.550598 sshd[2289]: Accepted publickey for core from 4.153.228.146 port 42498 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:24.551176 sshd-session[2289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:24.555154 systemd-logind[1667]: New session 4 of user core. Jan 14 01:03:24.564408 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 14 01:03:24.839923 sshd[2293]: Connection closed by 4.153.228.146 port 42498 Jan 14 01:03:24.840578 sshd-session[2289]: pam_unix(sshd:session): session closed for user core Jan 14 01:03:24.845009 systemd[1]: sshd@2-10.0.21.32:22-4.153.228.146:42498.service: Deactivated successfully. Jan 14 01:03:24.846540 systemd[1]: session-4.scope: Deactivated successfully. Jan 14 01:03:24.847630 systemd-logind[1667]: Session 4 logged out. Waiting for processes to exit. Jan 14 01:03:24.848675 systemd-logind[1667]: Removed session 4. Jan 14 01:03:24.950084 systemd[1]: Started sshd@3-10.0.21.32:22-4.153.228.146:39514.service - OpenSSH per-connection server daemon (4.153.228.146:39514). Jan 14 01:03:25.372963 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 29. 
Jan 14 01:03:25.376227 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:25.481340 sshd[2299]: Accepted publickey for core from 4.153.228.146 port 39514 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:25.482901 sshd-session[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:25.486982 systemd-logind[1667]: New session 5 of user core. Jan 14 01:03:25.493221 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 14 01:03:25.730638 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:25.743471 (kubelet)[2313]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:03:25.774918 kubelet[2313]: E0114 01:03:25.774876 2313 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:03:25.775215 sshd[2306]: Connection closed by 4.153.228.146 port 39514 Jan 14 01:03:25.775485 sshd-session[2299]: pam_unix(sshd:session): session closed for user core Jan 14 01:03:25.779085 systemd-logind[1667]: Session 5 logged out. Waiting for processes to exit. Jan 14 01:03:25.780335 systemd[1]: sshd@3-10.0.21.32:22-4.153.228.146:39514.service: Deactivated successfully. Jan 14 01:03:25.781886 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:03:25.782017 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:03:25.782625 systemd[1]: kubelet.service: Consumed 131ms CPU time, 109.9M memory peak. Jan 14 01:03:25.782922 systemd[1]: session-5.scope: Deactivated successfully. Jan 14 01:03:25.784923 systemd-logind[1667]: Removed session 5. Jan 14 01:03:25.883940 systemd[1]: Started sshd@4-10.0.21.32:22-4.153.228.146:39522.service - OpenSSH per-connection server daemon (4.153.228.146:39522). Jan 14 01:03:26.420647 sshd[2324]: Accepted publickey for core from 4.153.228.146 port 39522 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:26.421153 sshd-session[2324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:26.424549 systemd-logind[1667]: New session 6 of user core. Jan 14 01:03:26.432250 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 14 01:03:26.654464 sudo[2329]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 14 01:03:26.654979 sudo[2329]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:03:26.669021 sudo[2329]: pam_unix(sudo:session): session closed for user root Jan 14 01:03:26.769474 sshd[2328]: Connection closed by 4.153.228.146 port 39522 Jan 14 01:03:26.769224 sshd-session[2324]: pam_unix(sshd:session): session closed for user core Jan 14 01:03:26.773760 systemd[1]: sshd@4-10.0.21.32:22-4.153.228.146:39522.service: Deactivated successfully. Jan 14 01:03:26.775717 systemd[1]: session-6.scope: Deactivated successfully. Jan 14 01:03:26.777074 systemd-logind[1667]: Session 6 logged out. Waiting for processes to exit. Jan 14 01:03:26.778480 systemd-logind[1667]: Removed session 6. 
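The repeated kubelet failures above (restart counters 28 and 29, roughly ten seconds apart, and again later in this log) all have the same cause: /var/lib/kubelet/config.yaml does not exist yet, so the unit exits with status 1 and systemd schedules another restart. That file is normally written during node bootstrap by kubeadm init or kubeadm join, so the loop is expected until that step runs. A minimal way to confirm the state, assuming standard kubeadm/kubelet paths:

    # Check whether the kubelet config has been generated yet
    ls -l /var/lib/kubelet/config.yaml
    # Inspect the restart loop and the exact error
    systemctl status kubelet --no-pager
    journalctl -u kubelet -n 20 --no-pager
    # The file appears once bootstrap runs, e.g. (hypothetical endpoint/token):
    # kubeadm join <control-plane>:6443 --token <token> --discovery-token-ca-cert-hash sha256:<hash>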
Jan 14 01:03:26.878280 systemd[1]: Started sshd@5-10.0.21.32:22-4.153.228.146:39532.service - OpenSSH per-connection server daemon (4.153.228.146:39532). Jan 14 01:03:27.422088 sshd[2336]: Accepted publickey for core from 4.153.228.146 port 39532 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:27.422893 sshd-session[2336]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:27.426623 systemd-logind[1667]: New session 7 of user core. Jan 14 01:03:27.436242 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 14 01:03:27.626299 sudo[2342]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 14 01:03:27.626834 sudo[2342]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:03:27.631011 sudo[2342]: pam_unix(sudo:session): session closed for user root Jan 14 01:03:27.636944 sudo[2341]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 14 01:03:27.637191 sudo[2341]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:03:27.644166 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:03:27.673000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:03:27.675360 augenrules[2366]: No rules Jan 14 01:03:27.675528 kernel: kauditd_printk_skb: 38 callbacks suppressed Jan 14 01:03:27.675557 kernel: audit: type=1305 audit(1768352607.673:247): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:03:27.677146 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:03:27.677357 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 01:03:27.678639 sudo[2341]: pam_unix(sudo:session): session closed for user root Jan 14 01:03:27.673000 audit[2366]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdf7a9bb40 a2=420 a3=0 items=0 ppid=2347 pid=2366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:27.673000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:03:27.683364 kernel: audit: type=1300 audit(1768352607.673:247): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdf7a9bb40 a2=420 a3=0 items=0 ppid=2347 pid=2366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:27.683412 kernel: audit: type=1327 audit(1768352607.673:247): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:03:27.676000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.685673 kernel: audit: type=1130 audit(1768352607.676:248): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:03:27.685705 kernel: audit: type=1131 audit(1768352607.676:249): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.676000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.687920 kernel: audit: type=1106 audit(1768352607.677:250): pid=2341 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.677000 audit[2341]: USER_END pid=2341 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.691073 kernel: audit: type=1104 audit(1768352607.677:251): pid=2341 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.677000 audit[2341]: CRED_DISP pid=2341 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.780705 sshd[2340]: Connection closed by 4.153.228.146 port 39532 Jan 14 01:03:27.780633 sshd-session[2336]: pam_unix(sshd:session): session closed for user core Jan 14 01:03:27.787215 kernel: audit: type=1106 audit(1768352607.782:252): pid=2336 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:27.782000 audit[2336]: USER_END pid=2336 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:27.785678 systemd[1]: sshd@5-10.0.21.32:22-4.153.228.146:39532.service: Deactivated successfully. Jan 14 01:03:27.787540 systemd[1]: session-7.scope: Deactivated successfully. Jan 14 01:03:27.782000 audit[2336]: CRED_DISP pid=2336 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:27.790541 systemd-logind[1667]: Session 7 logged out. Waiting for processes to exit. Jan 14 01:03:27.783000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.21.32:22-4.153.228.146:39532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:03:27.793195 kernel: audit: type=1104 audit(1768352607.782:253): pid=2336 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:27.793239 kernel: audit: type=1131 audit(1768352607.783:254): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.21.32:22-4.153.228.146:39532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.793529 systemd-logind[1667]: Removed session 7. Jan 14 01:03:27.888000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.32:22-4.153.228.146:39540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:27.889189 systemd[1]: Started sshd@6-10.0.21.32:22-4.153.228.146:39540.service - OpenSSH per-connection server daemon (4.153.228.146:39540). Jan 14 01:03:28.425000 audit[2375]: USER_ACCT pid=2375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:28.426697 sshd[2375]: Accepted publickey for core from 4.153.228.146 port 39540 ssh2: RSA SHA256:FB1QBx3PrbvcjURCyfGa9gpun1MfC/bBEvjrV2gePCM Jan 14 01:03:28.426000 audit[2375]: CRED_ACQ pid=2375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:28.426000 audit[2375]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff196aef80 a2=3 a3=0 items=0 ppid=1 pid=2375 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:28.426000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:03:28.427998 sshd-session[2375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:03:28.431711 systemd-logind[1667]: New session 8 of user core. Jan 14 01:03:28.439445 systemd[1]: Started session-8.scope - Session 8 of User core. 
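Audit records in this log carry the executed command line as a hex-encoded PROCTITLE field with NUL-separated arguments; the one just above decodes to "sshd-session: core [priv]". They can be read back with the audit userspace tools, or decoded by hand; a small sketch assuming ausearch and xxd are available:

    # Let ausearch interpret raw records, including proctitle decoding
    ausearch -i -ts today | tail -n 40
    # Or decode a single hex proctitle by hand (NUL separators become spaces)
    echo 737368642D73657373696F6E3A20636F7265205B707269765D | xxd -r -p | tr '\0' ' '; echo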
Jan 14 01:03:28.440000 audit[2375]: USER_START pid=2375 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:28.441000 audit[2379]: CRED_ACQ pid=2379 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:03:28.631446 sudo[2380]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 14 01:03:28.631692 sudo[2380]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:03:28.630000 audit[2380]: USER_ACCT pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:28.630000 audit[2380]: CRED_REFR pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:28.630000 audit[2380]: USER_START pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:03:29.054235 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 14 01:03:29.071532 (dockerd)[2399]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 14 01:03:29.368520 dockerd[2399]: time="2026-01-14T01:03:29.368474376Z" level=info msg="Starting up" Jan 14 01:03:29.369483 dockerd[2399]: time="2026-01-14T01:03:29.369464103Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 14 01:03:29.379565 dockerd[2399]: time="2026-01-14T01:03:29.379495377Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 14 01:03:29.399740 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1180495246-merged.mount: Deactivated successfully. Jan 14 01:03:29.440906 dockerd[2399]: time="2026-01-14T01:03:29.440770998Z" level=info msg="Loading containers: start." 
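Both kubelet.service and docker.service start with "Referenced but unset environment variable evaluates to an empty string". That is systemd noting that the unit's command line references variables (KUBELET_EXTRA_ARGS, DOCKER_OPTS, and so on) which no Environment= or EnvironmentFile= defines, so they expand to empty strings; it is only a warning and is not what makes kubelet exit. If extra flags were wanted, a drop-in defining one of the referenced variables would be the usual route; a sketch with a hypothetical value:

    # Hypothetical drop-in giving DOCKER_OPTS a value (any flag dockerd accepts)
    sudo mkdir -p /etc/systemd/system/docker.service.d
    sudo tee /etc/systemd/system/docker.service.d/10-opts.conf <<'EOF'
    [Service]
    Environment="DOCKER_OPTS=--log-level=warn"
    EOF
    sudo systemctl daemon-reload
    sudo systemctl restart docker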
Jan 14 01:03:29.451075 kernel: Initializing XFRM netlink socket Jan 14 01:03:29.506000 audit[2447]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2447 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.506000 audit[2447]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffef725d700 a2=0 a3=0 items=0 ppid=2399 pid=2447 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.506000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 01:03:29.508000 audit[2449]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2449 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.508000 audit[2449]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fffdc37f0c0 a2=0 a3=0 items=0 ppid=2399 pid=2449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.508000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 01:03:29.510000 audit[2451]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2451 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.510000 audit[2451]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff5d297f60 a2=0 a3=0 items=0 ppid=2399 pid=2451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.510000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 01:03:29.511000 audit[2453]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2453 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.511000 audit[2453]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc7dea0780 a2=0 a3=0 items=0 ppid=2399 pid=2453 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.511000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 01:03:29.513000 audit[2455]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2455 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.513000 audit[2455]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe6e95e860 a2=0 a3=0 items=0 ppid=2399 pid=2455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.513000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 01:03:29.515000 audit[2457]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2457 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.515000 audit[2457]: SYSCALL arch=c000003e syscall=46 
success=yes exit=112 a0=3 a1=7fff76023b30 a2=0 a3=0 items=0 ppid=2399 pid=2457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.515000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:03:29.517000 audit[2459]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2459 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.517000 audit[2459]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffdc9238150 a2=0 a3=0 items=0 ppid=2399 pid=2459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.517000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:03:29.518000 audit[2461]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2461 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.518000 audit[2461]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7fff68b41060 a2=0 a3=0 items=0 ppid=2399 pid=2461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.518000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 01:03:29.558000 audit[2464]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2464 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.558000 audit[2464]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffe17363690 a2=0 a3=0 items=0 ppid=2399 pid=2464 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.558000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 14 01:03:29.560000 audit[2466]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2466 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.560000 audit[2466]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc3d8c90d0 a2=0 a3=0 items=0 ppid=2399 pid=2466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.560000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 01:03:29.562000 audit[2468]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2468 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.562000 audit[2468]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffff64c31a0 a2=0 
a3=0 items=0 ppid=2399 pid=2468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.562000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 01:03:29.564000 audit[2470]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2470 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.564000 audit[2470]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffed344fd40 a2=0 a3=0 items=0 ppid=2399 pid=2470 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.564000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:03:29.566000 audit[2472]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2472 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.566000 audit[2472]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffd685633d0 a2=0 a3=0 items=0 ppid=2399 pid=2472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.566000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 01:03:29.603000 audit[2502]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2502 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.603000 audit[2502]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffc2d1e40a0 a2=0 a3=0 items=0 ppid=2399 pid=2502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.603000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 01:03:29.605000 audit[2504]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2504 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.605000 audit[2504]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff57c327a0 a2=0 a3=0 items=0 ppid=2399 pid=2504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.605000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 01:03:29.606000 audit[2506]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2506 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.606000 audit[2506]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd28550390 a2=0 a3=0 items=0 ppid=2399 pid=2506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:03:29.606000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 01:03:29.608000 audit[2508]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2508 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.608000 audit[2508]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc11b53b60 a2=0 a3=0 items=0 ppid=2399 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.608000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 01:03:29.610000 audit[2510]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2510 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.610000 audit[2510]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffdeaa96ea0 a2=0 a3=0 items=0 ppid=2399 pid=2510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.610000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 01:03:29.612000 audit[2512]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.612000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffdeae5d9f0 a2=0 a3=0 items=0 ppid=2399 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.612000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:03:29.614000 audit[2514]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2514 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.614000 audit[2514]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffeb3f6cdb0 a2=0 a3=0 items=0 ppid=2399 pid=2514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.614000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:03:29.615000 audit[2516]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2516 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.615000 audit[2516]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffc6ae4b110 a2=0 a3=0 items=0 ppid=2399 pid=2516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.615000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 01:03:29.618000 audit[2518]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2518 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.618000 audit[2518]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7fffcc0cece0 a2=0 a3=0 items=0 ppid=2399 pid=2518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.618000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 14 01:03:29.620000 audit[2520]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2520 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.620000 audit[2520]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff3d4874c0 a2=0 a3=0 items=0 ppid=2399 pid=2520 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.620000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 01:03:29.622000 audit[2522]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2522 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.622000 audit[2522]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffe8ab3ffa0 a2=0 a3=0 items=0 ppid=2399 pid=2522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.622000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 01:03:29.623000 audit[2524]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2524 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.623000 audit[2524]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffc6b16ec10 a2=0 a3=0 items=0 ppid=2399 pid=2524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.623000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:03:29.625000 audit[2526]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2526 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.625000 audit[2526]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffee654ce80 a2=0 a3=0 items=0 ppid=2399 pid=2526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.625000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 01:03:29.630000 audit[2531]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2531 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.630000 audit[2531]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd8012fcc0 a2=0 a3=0 items=0 ppid=2399 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.630000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 01:03:29.631000 audit[2533]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.631000 audit[2533]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fff96573a30 a2=0 a3=0 items=0 ppid=2399 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.631000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 01:03:29.633000 audit[2535]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2535 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.633000 audit[2535]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffa2ecba90 a2=0 a3=0 items=0 ppid=2399 pid=2535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.633000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 01:03:29.634000 audit[2537]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2537 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.634000 audit[2537]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcf9691390 a2=0 a3=0 items=0 ppid=2399 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.634000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 01:03:29.636000 audit[2539]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2539 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.636000 audit[2539]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe36704ab0 a2=0 a3=0 items=0 ppid=2399 pid=2539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.636000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 01:03:29.638000 audit[2541]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2541 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:29.638000 audit[2541]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffbdaf3750 a2=0 a3=0 items=0 ppid=2399 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.638000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 01:03:29.676000 audit[2546]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2546 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.676000 audit[2546]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffc8961af10 a2=0 a3=0 items=0 ppid=2399 pid=2546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.676000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 14 01:03:29.678000 audit[2548]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2548 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.678000 audit[2548]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd7e739620 a2=0 a3=0 items=0 ppid=2399 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.678000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 14 01:03:29.686000 audit[2556]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.686000 audit[2556]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffd16d188e0 a2=0 a3=0 items=0 ppid=2399 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.686000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 14 01:03:29.698000 audit[2562]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.698000 audit[2562]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffddb727670 a2=0 a3=0 items=0 ppid=2399 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.698000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 14 01:03:29.700000 audit[2564]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 
01:03:29.700000 audit[2564]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffeeba5fe90 a2=0 a3=0 items=0 ppid=2399 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.700000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 14 01:03:29.702000 audit[2566]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2566 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.702000 audit[2566]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe1854f540 a2=0 a3=0 items=0 ppid=2399 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.702000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 14 01:03:29.704000 audit[2568]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2568 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.704000 audit[2568]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fff652848e0 a2=0 a3=0 items=0 ppid=2399 pid=2568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.704000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:03:29.706000 audit[2570]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2570 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:29.706000 audit[2570]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffda54315b0 a2=0 a3=0 items=0 ppid=2399 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:29.706000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 14 01:03:29.708108 systemd-networkd[1385]: docker0: Link UP Jan 14 01:03:29.715379 dockerd[2399]: time="2026-01-14T01:03:29.715287506Z" level=info msg="Loading containers: done." Jan 14 01:03:29.725883 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2238517768-merged.mount: Deactivated successfully. 
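The long run of NETFILTER_CFG, SYSCALL and PROCTITLE records between "Loading containers: start." and the docker0 "Link UP" line is dockerd creating its standard iptables and ip6tables chains and wiring them into FORWARD. Decoding the hex proctitle fields gives the commands below; this is a representative subset of the IPv4 side (the ip6tables records mirror it):

    # Chain creation and FORWARD wiring, decoded from the audit records above
    /usr/bin/iptables --wait -t nat    -N DOCKER
    /usr/bin/iptables --wait -t filter -N DOCKER
    /usr/bin/iptables --wait -t filter -N DOCKER-FORWARD
    /usr/bin/iptables --wait -t filter -N DOCKER-BRIDGE
    /usr/bin/iptables --wait -t filter -N DOCKER-CT
    /usr/bin/iptables --wait -t filter -N DOCKER-ISOLATION-STAGE-1
    /usr/bin/iptables --wait -t filter -N DOCKER-ISOLATION-STAGE-2
    /usr/bin/iptables --wait -t nat -A PREROUTING -m addrtype --dst-type LOCAL -j DOCKER
    /usr/bin/iptables --wait -I FORWARD -j DOCKER-FORWARD
    /usr/bin/iptables --wait -t filter -N DOCKER-USER
    /usr/bin/iptables --wait -A DOCKER-USER -j RETURN
    /usr/bin/iptables --wait -I FORWARD -j DOCKER-USER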
Jan 14 01:03:29.736542 dockerd[2399]: time="2026-01-14T01:03:29.736481863Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 14 01:03:29.736692 dockerd[2399]: time="2026-01-14T01:03:29.736559378Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 14 01:03:29.736692 dockerd[2399]: time="2026-01-14T01:03:29.736625634Z" level=info msg="Initializing buildkit" Jan 14 01:03:29.767571 dockerd[2399]: time="2026-01-14T01:03:29.767524680Z" level=info msg="Completed buildkit initialization" Jan 14 01:03:29.773130 dockerd[2399]: time="2026-01-14T01:03:29.773076602Z" level=info msg="Daemon has completed initialization" Jan 14 01:03:29.773433 dockerd[2399]: time="2026-01-14T01:03:29.773265599Z" level=info msg="API listen on /run/docker.sock" Jan 14 01:03:29.777427 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 14 01:03:29.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:31.074202 containerd[1695]: time="2026-01-14T01:03:31.074118697Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 14 01:03:31.928180 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1781282737.mount: Deactivated successfully. Jan 14 01:03:32.610021 containerd[1695]: time="2026-01-14T01:03:32.609245454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:32.611526 containerd[1695]: time="2026-01-14T01:03:32.611505129Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=25399329" Jan 14 01:03:32.613153 containerd[1695]: time="2026-01-14T01:03:32.613118094Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:32.616082 containerd[1695]: time="2026-01-14T01:03:32.616061184Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:32.616697 containerd[1695]: time="2026-01-14T01:03:32.616671039Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 1.542523089s" Jan 14 01:03:32.616751 containerd[1695]: time="2026-01-14T01:03:32.616700570Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Jan 14 01:03:32.617242 containerd[1695]: time="2026-01-14T01:03:32.617228682Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 14 01:03:33.794145 containerd[1695]: time="2026-01-14T01:03:33.794041627Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:33.796966 containerd[1695]: time="2026-01-14T01:03:33.796689011Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21154285" Jan 14 01:03:33.809688 containerd[1695]: time="2026-01-14T01:03:33.809655029Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:33.815468 containerd[1695]: time="2026-01-14T01:03:33.815434433Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:33.816270 containerd[1695]: time="2026-01-14T01:03:33.816246167Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.198949657s" Jan 14 01:03:33.816407 containerd[1695]: time="2026-01-14T01:03:33.816331051Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Jan 14 01:03:33.816840 containerd[1695]: time="2026-01-14T01:03:33.816814737Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 14 01:03:35.082100 containerd[1695]: time="2026-01-14T01:03:35.082043317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:35.083487 containerd[1695]: time="2026-01-14T01:03:35.083323196Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15717792" Jan 14 01:03:35.085767 containerd[1695]: time="2026-01-14T01:03:35.085743322Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:35.089017 containerd[1695]: time="2026-01-14T01:03:35.088993821Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:35.089630 containerd[1695]: time="2026-01-14T01:03:35.089607995Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 1.272768876s" Jan 14 01:03:35.089675 containerd[1695]: time="2026-01-14T01:03:35.089635272Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Jan 14 01:03:35.090411 containerd[1695]: time="2026-01-14T01:03:35.090396635Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 
14 01:03:35.834270 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 30. Jan 14 01:03:35.836064 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:35.963197 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:35.968134 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 14 01:03:35.968219 kernel: audit: type=1130 audit(1768352615.963:305): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:35.963000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:35.975307 (kubelet)[2688]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:03:36.016979 kubelet[2688]: E0114 01:03:36.016942 2688 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:03:36.019222 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:03:36.019339 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:03:36.019796 systemd[1]: kubelet.service: Consumed 132ms CPU time, 110.1M memory peak. Jan 14 01:03:36.023175 kernel: audit: type=1131 audit(1768352616.018:306): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:03:36.018000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:03:36.104838 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2558380430.mount: Deactivated successfully. 
Jan 14 01:03:37.485265 containerd[1695]: time="2026-01-14T01:03:37.485227131Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:37.486976 containerd[1695]: time="2026-01-14T01:03:37.486960293Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Jan 14 01:03:37.488612 containerd[1695]: time="2026-01-14T01:03:37.488594533Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:37.491528 containerd[1695]: time="2026-01-14T01:03:37.491509560Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:37.492064 containerd[1695]: time="2026-01-14T01:03:37.491808108Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 2.401289013s" Jan 14 01:03:37.492269 containerd[1695]: time="2026-01-14T01:03:37.492258093Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Jan 14 01:03:37.492786 containerd[1695]: time="2026-01-14T01:03:37.492693414Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 14 01:03:38.103337 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount819078109.mount: Deactivated successfully. 
Jan 14 01:03:38.743068 containerd[1695]: time="2026-01-14T01:03:38.742938099Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:38.745036 containerd[1695]: time="2026-01-14T01:03:38.745002700Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=21568511" Jan 14 01:03:38.746507 containerd[1695]: time="2026-01-14T01:03:38.746473840Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:38.750062 containerd[1695]: time="2026-01-14T01:03:38.749881468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:38.750397 containerd[1695]: time="2026-01-14T01:03:38.750380846Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.257640954s" Jan 14 01:03:38.750451 containerd[1695]: time="2026-01-14T01:03:38.750442881Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Jan 14 01:03:38.751075 containerd[1695]: time="2026-01-14T01:03:38.751055516Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Jan 14 01:03:39.348677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2830700842.mount: Deactivated successfully. 
Jan 14 01:03:39.360816 containerd[1695]: time="2026-01-14T01:03:39.360614092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:39.362350 containerd[1695]: time="2026-01-14T01:03:39.362299857Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Jan 14 01:03:39.363990 containerd[1695]: time="2026-01-14T01:03:39.363940360Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:39.366566 containerd[1695]: time="2026-01-14T01:03:39.366449406Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:39.367097 containerd[1695]: time="2026-01-14T01:03:39.366953619Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 615.797427ms" Jan 14 01:03:39.367097 containerd[1695]: time="2026-01-14T01:03:39.366979170Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Jan 14 01:03:39.367666 containerd[1695]: time="2026-01-14T01:03:39.367645852Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Jan 14 01:03:39.927761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1473088291.mount: Deactivated successfully. Jan 14 01:03:42.260910 containerd[1695]: time="2026-01-14T01:03:42.260776827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:42.263415 containerd[1695]: time="2026-01-14T01:03:42.263287505Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=61186606" Jan 14 01:03:42.264861 containerd[1695]: time="2026-01-14T01:03:42.264823075Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:42.268827 containerd[1695]: time="2026-01-14T01:03:42.268782660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:03:42.269631 containerd[1695]: time="2026-01-14T01:03:42.269597867Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 2.901928844s" Jan 14 01:03:42.269631 containerd[1695]: time="2026-01-14T01:03:42.269622279Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Jan 14 01:03:45.724309 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
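At this point containerd has pulled the full control-plane image set for this Kubernetes version: kube-apiserver, kube-controller-manager, kube-scheduler and kube-proxy at v1.34.3, coredns v1.12.1, pause 3.10.1 and etcd 3.6.4-0. The same images can be listed or pre-pulled by hand through the CRI socket; a sketch assuming crictl is installed and pointed at containerd:

    # List what the runtime already holds, then pre-pull the tags seen in this log
    crictl images
    for img in \
        registry.k8s.io/kube-apiserver:v1.34.3 \
        registry.k8s.io/kube-controller-manager:v1.34.3 \
        registry.k8s.io/kube-scheduler:v1.34.3 \
        registry.k8s.io/kube-proxy:v1.34.3 \
        registry.k8s.io/coredns/coredns:v1.12.1 \
        registry.k8s.io/pause:3.10.1 \
        registry.k8s.io/etcd:3.6.4-0; do
      crictl pull "$img"
    done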
Jan 14 01:03:45.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:45.724563 systemd[1]: kubelet.service: Consumed 132ms CPU time, 110.1M memory peak. Jan 14 01:03:45.723000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:45.727633 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:45.728397 kernel: audit: type=1130 audit(1768352625.723:307): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:45.728442 kernel: audit: type=1131 audit(1768352625.723:308): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:45.754108 systemd[1]: Reload requested from client PID 2835 ('systemctl') (unit session-8.scope)... Jan 14 01:03:45.754119 systemd[1]: Reloading... Jan 14 01:03:45.843079 zram_generator::config[2877]: No configuration found. Jan 14 01:03:46.028791 systemd[1]: Reloading finished in 274 ms. Jan 14 01:03:46.060921 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 14 01:03:46.061144 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 14 01:03:46.061451 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:46.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:03:46.061603 systemd[1]: kubelet.service: Consumed 75ms CPU time, 92.7M memory peak. Jan 14 01:03:46.066097 kernel: audit: type=1130 audit(1768352626.060:309): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:03:46.066256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
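The kernel audit records above carry their own timestamp in the form audit(&lt;epoch seconds&gt;:&lt;serial&gt;), which can be cross-checked against the journal's wall-clock prefix. A minimal sketch; the function name and the sample line are illustrative:

    import re
    from datetime import datetime, timezone

    AUDIT_RE = re.compile(r"audit\((\d+\.\d+):(\d+)\)")

    def audit_timestamp(line: str):
        # Return (serial, UTC time) for a kernel audit record, or None.
        m = AUDIT_RE.search(line)
        if not m:
            return None
        epoch, serial = m.groups()
        return int(serial), datetime.fromtimestamp(float(epoch), tz=timezone.utc)

    # The SERVICE_START/SERVICE_STOP pair above is stamped 1768352625.723,
    # which decodes to 2026-01-14 01:03:45.723 UTC and matches the journal prefix.
    print(audit_timestamp("audit: type=1130 audit(1768352625.723:307): pid=1"))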
Jan 14 01:03:46.068000 audit: BPF prog-id=67 op=LOAD Jan 14 01:03:46.071705 kernel: audit: type=1334 audit(1768352626.068:310): prog-id=67 op=LOAD Jan 14 01:03:46.071757 kernel: audit: type=1334 audit(1768352626.068:311): prog-id=64 op=UNLOAD Jan 14 01:03:46.068000 audit: BPF prog-id=64 op=UNLOAD Jan 14 01:03:46.068000 audit: BPF prog-id=68 op=LOAD Jan 14 01:03:46.068000 audit: BPF prog-id=69 op=LOAD Jan 14 01:03:46.068000 audit: BPF prog-id=65 op=UNLOAD Jan 14 01:03:46.068000 audit: BPF prog-id=66 op=UNLOAD Jan 14 01:03:46.070000 audit: BPF prog-id=70 op=LOAD Jan 14 01:03:46.070000 audit: BPF prog-id=63 op=UNLOAD Jan 14 01:03:46.072061 kernel: audit: type=1334 audit(1768352626.068:312): prog-id=68 op=LOAD Jan 14 01:03:46.072084 kernel: audit: type=1334 audit(1768352626.068:313): prog-id=69 op=LOAD Jan 14 01:03:46.072101 kernel: audit: type=1334 audit(1768352626.068:314): prog-id=65 op=UNLOAD Jan 14 01:03:46.072156 kernel: audit: type=1334 audit(1768352626.068:315): prog-id=66 op=UNLOAD Jan 14 01:03:46.072173 kernel: audit: type=1334 audit(1768352626.070:316): prog-id=70 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=71 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=57 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=72 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=73 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=58 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=59 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=74 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=60 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=75 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=76 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=61 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=62 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=77 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=78 op=LOAD Jan 14 01:03:46.074000 audit: BPF prog-id=51 op=UNLOAD Jan 14 01:03:46.074000 audit: BPF prog-id=52 op=UNLOAD Jan 14 01:03:46.075000 audit: BPF prog-id=79 op=LOAD Jan 14 01:03:46.075000 audit: BPF prog-id=50 op=UNLOAD Jan 14 01:03:46.075000 audit: BPF prog-id=80 op=LOAD Jan 14 01:03:46.075000 audit: BPF prog-id=56 op=UNLOAD Jan 14 01:03:46.076000 audit: BPF prog-id=81 op=LOAD Jan 14 01:03:46.076000 audit: BPF prog-id=53 op=UNLOAD Jan 14 01:03:46.076000 audit: BPF prog-id=82 op=LOAD Jan 14 01:03:46.076000 audit: BPF prog-id=83 op=LOAD Jan 14 01:03:46.076000 audit: BPF prog-id=54 op=UNLOAD Jan 14 01:03:46.076000 audit: BPF prog-id=55 op=UNLOAD Jan 14 01:03:46.076000 audit: BPF prog-id=84 op=LOAD Jan 14 01:03:46.076000 audit: BPF prog-id=47 op=UNLOAD Jan 14 01:03:46.077000 audit: BPF prog-id=85 op=LOAD Jan 14 01:03:46.077000 audit: BPF prog-id=86 op=LOAD Jan 14 01:03:46.077000 audit: BPF prog-id=48 op=UNLOAD Jan 14 01:03:46.077000 audit: BPF prog-id=49 op=UNLOAD Jan 14 01:03:47.077992 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:47.077000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:47.087473 (kubelet)[2932]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:03:47.124217 kubelet[2932]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
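The burst of BPF records above is consistent with systemd swapping per-unit BPF programs while it reloads, each LOAD paired with an UNLOAD of the program it replaces. A quick tally, assuming the journal text arrives on stdin:

    import re
    import sys
    from collections import Counter

    # Count LOAD vs UNLOAD operations from "audit: BPF prog-id=N op=..." records.
    BPF_RE = re.compile(r"BPF prog-id=(\d+) op=(LOAD|UNLOAD)")

    ops = Counter(op for _, op in BPF_RE.findall(sys.stdin.read()))
    print(dict(ops))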
Jan 14 01:03:47.124217 kubelet[2932]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:03:47.124217 kubelet[2932]: I0114 01:03:47.124130 2932 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:03:47.869803 kubelet[2932]: I0114 01:03:47.869042 2932 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 14 01:03:47.869803 kubelet[2932]: I0114 01:03:47.869078 2932 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:03:47.869803 kubelet[2932]: I0114 01:03:47.869101 2932 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 14 01:03:47.869803 kubelet[2932]: I0114 01:03:47.869107 2932 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:03:47.869803 kubelet[2932]: I0114 01:03:47.869456 2932 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:03:47.919219 kubelet[2932]: E0114 01:03:47.919180 2932 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.21.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 14 01:03:47.920096 kubelet[2932]: I0114 01:03:47.920078 2932 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:03:47.943963 kubelet[2932]: I0114 01:03:47.943939 2932 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:03:47.946568 kubelet[2932]: I0114 01:03:47.946553 2932 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 14 01:03:47.946805 kubelet[2932]: I0114 01:03:47.946788 2932 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:03:47.946975 kubelet[2932]: I0114 01:03:47.946852 2932 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-de0c74fc75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:03:47.947095 kubelet[2932]: I0114 01:03:47.947088 2932 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 01:03:47.947130 kubelet[2932]: I0114 01:03:47.947127 2932 container_manager_linux.go:306] "Creating device plugin manager" Jan 14 01:03:47.947230 kubelet[2932]: I0114 01:03:47.947224 2932 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 14 01:03:48.067553 kubelet[2932]: I0114 01:03:48.067519 2932 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:03:48.067915 kubelet[2932]: I0114 01:03:48.067858 2932 kubelet.go:475] "Attempting to sync node with API server" Jan 14 01:03:48.067915 kubelet[2932]: I0114 01:03:48.067871 2932 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:03:48.067915 kubelet[2932]: I0114 01:03:48.067888 2932 kubelet.go:387] "Adding apiserver pod source" Jan 14 01:03:48.067915 kubelet[2932]: I0114 01:03:48.067901 2932 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:03:48.068415 kubelet[2932]: E0114 01:03:48.068381 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.21.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-de0c74fc75&limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 01:03:48.074323 kubelet[2932]: E0114 01:03:48.074294 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://10.0.21.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 14 01:03:48.093536 kubelet[2932]: I0114 01:03:48.093512 2932 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:03:48.094158 kubelet[2932]: I0114 01:03:48.094009 2932 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:03:48.094158 kubelet[2932]: I0114 01:03:48.094036 2932 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 14 01:03:48.094158 kubelet[2932]: W0114 01:03:48.094086 2932 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 14 01:03:48.169144 kubelet[2932]: I0114 01:03:48.169075 2932 server.go:1262] "Started kubelet" Jan 14 01:03:48.424684 kubelet[2932]: I0114 01:03:48.424520 2932 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:03:48.427000 audit[2945]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2945 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.427000 audit[2945]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd60f73cd0 a2=0 a3=0 items=0 ppid=2932 pid=2945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.427000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:03:48.428000 audit[2946]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2946 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.428000 audit[2946]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdec5c9950 a2=0 a3=0 items=0 ppid=2932 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.428000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:03:48.430893 kubelet[2932]: I0114 01:03:48.430856 2932 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:03:48.433057 kubelet[2932]: I0114 01:03:48.433030 2932 server.go:310] "Adding debug handlers to kubelet server" Jan 14 01:03:48.435579 kubelet[2932]: I0114 01:03:48.435529 2932 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:03:48.435632 kubelet[2932]: I0114 01:03:48.435593 2932 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 14 01:03:48.435731 kubelet[2932]: I0114 01:03:48.435716 2932 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:03:48.435913 kubelet[2932]: I0114 01:03:48.435899 2932 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 
01:03:48.438340 kubelet[2932]: I0114 01:03:48.438325 2932 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 14 01:03:48.438490 kubelet[2932]: E0114 01:03:48.438476 2932 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" Jan 14 01:03:48.438000 audit[2950]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2950 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.438000 audit[2950]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe59a055a0 a2=0 a3=0 items=0 ppid=2932 pid=2950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.438000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:03:48.440719 kubelet[2932]: I0114 01:03:48.440701 2932 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 14 01:03:48.440757 kubelet[2932]: I0114 01:03:48.440742 2932 reconciler.go:29] "Reconciler: start to sync state" Jan 14 01:03:48.440000 audit[2952]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2952 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.440000 audit[2952]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc81294fc0 a2=0 a3=0 items=0 ppid=2932 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.440000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:03:48.446217 kubelet[2932]: E0114 01:03:48.446158 2932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-de0c74fc75?timeout=10s\": dial tcp 10.0.21.32:6443: connect: connection refused" interval="200ms" Jan 14 01:03:48.447975 kubelet[2932]: E0114 01:03:48.446385 2932 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.21.32:6443/api/v1/namespaces/default/events\": dial tcp 10.0.21.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-n-de0c74fc75.188a73589fa4be4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-n-de0c74fc75,UID:ci-4547-0-0-n-de0c74fc75,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-de0c74fc75,},FirstTimestamp:2026-01-14 01:03:48.169031247 +0000 UTC m=+1.078313520,LastTimestamp:2026-01-14 01:03:48.169031247 +0000 UTC m=+1.078313520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-de0c74fc75,}" Jan 14 01:03:48.448321 kubelet[2932]: I0114 01:03:48.448298 2932 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:03:48.448379 kubelet[2932]: I0114 01:03:48.448360 2932 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file 
or directory Jan 14 01:03:48.449450 kubelet[2932]: I0114 01:03:48.449438 2932 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:03:48.462263 kubelet[2932]: E0114 01:03:48.460744 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.21.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 14 01:03:48.467523 kubelet[2932]: I0114 01:03:48.467504 2932 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:03:48.467523 kubelet[2932]: I0114 01:03:48.467519 2932 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:03:48.467620 kubelet[2932]: I0114 01:03:48.467533 2932 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:03:48.480000 audit[2959]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2959 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.480000 audit[2959]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffd8e5236f0 a2=0 a3=0 items=0 ppid=2932 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.480000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Jan 14 01:03:48.481000 audit[2962]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2962 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.481000 audit[2962]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe02a0ea00 a2=0 a3=0 items=0 ppid=2932 pid=2962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.481000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:03:48.483037 kubelet[2932]: I0114 01:03:48.482032 2932 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Jan 14 01:03:48.482000 audit[2963]: NETFILTER_CFG table=nat:48 family=2 entries=1 op=nft_register_chain pid=2963 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.482000 audit[2963]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcfba02270 a2=0 a3=0 items=0 ppid=2932 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.482000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:03:48.483000 audit[2961]: NETFILTER_CFG table=mangle:49 family=10 entries=2 op=nft_register_chain pid=2961 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:48.483000 audit[2961]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffff32246a0 a2=0 a3=0 items=0 ppid=2932 pid=2961 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.483000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:03:48.485094 kubelet[2932]: I0114 01:03:48.484686 2932 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Jan 14 01:03:48.485094 kubelet[2932]: I0114 01:03:48.484703 2932 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 14 01:03:48.485094 kubelet[2932]: I0114 01:03:48.484723 2932 kubelet.go:2427] "Starting kubelet main sync loop" Jan 14 01:03:48.485094 kubelet[2932]: E0114 01:03:48.484764 2932 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:03:48.484000 audit[2964]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2964 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:03:48.485920 kubelet[2932]: E0114 01:03:48.485906 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.21.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 01:03:48.484000 audit[2964]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffea66fe600 a2=0 a3=0 items=0 ppid=2932 pid=2964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.484000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:03:48.485000 audit[2965]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2965 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:48.485000 audit[2965]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe18841550 a2=0 a3=0 items=0 ppid=2932 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.485000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:03:48.486000 audit[2968]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2968 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:48.486000 audit[2968]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc06585d20 a2=0 a3=0 items=0 ppid=2932 pid=2968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.486000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:03:48.487000 audit[2969]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2969 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:03:48.487000 audit[2969]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffde1c9c640 a2=0 a3=0 items=0 ppid=2932 pid=2969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:48.487000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:03:48.539103 kubelet[2932]: E0114 01:03:48.539038 2932 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" Jan 14 01:03:48.585819 kubelet[2932]: E0114 01:03:48.585773 2932 kubelet.go:2451] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 14 01:03:48.605534 kubelet[2932]: I0114 01:03:48.605499 2932 policy_none.go:49] "None policy: Start" Jan 14 01:03:48.605534 kubelet[2932]: I0114 01:03:48.605528 2932 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 14 01:03:48.605534 kubelet[2932]: I0114 01:03:48.605540 2932 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 14 01:03:48.639250 kubelet[2932]: E0114 01:03:48.639207 2932 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" Jan 14 01:03:48.646890 kubelet[2932]: E0114 01:03:48.646858 2932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-de0c74fc75?timeout=10s\": dial tcp 10.0.21.32:6443: connect: connection refused" interval="400ms" Jan 14 01:03:48.699038 kubelet[2932]: I0114 01:03:48.698908 2932 policy_none.go:47] "Start" Jan 14 01:03:48.703181 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 14 01:03:48.716786 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 14 01:03:48.719858 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
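The PROCTITLE field in the audit records above is the process command line, hex-encoded with NUL-separated arguments. Decoding it recovers the exact iptables/ip6tables invocations the kubelet used to create its KUBE-* chains; a minimal sketch (the function name is illustrative):

    # Decode an audit PROCTITLE value: hex bytes with NUL-separated arguments.
    def decode_proctitle(hex_value: str) -> str:
        return " ".join(p.decode() for p in bytes.fromhex(hex_value).split(b"\x00") if p)

    # First NETFILTER_CFG record above (table=mangle:42):
    print(decode_proctitle(
        "69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65"
    ))
    # -> iptables -w 5 -N KUBE-IPTABLES-HINT -t mangle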
Jan 14 01:03:48.737923 kubelet[2932]: E0114 01:03:48.737892 2932 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:03:48.738277 kubelet[2932]: I0114 01:03:48.738257 2932 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:03:48.738321 kubelet[2932]: I0114 01:03:48.738269 2932 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:03:48.738579 kubelet[2932]: I0114 01:03:48.738512 2932 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:03:48.740105 kubelet[2932]: E0114 01:03:48.740091 2932 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 01:03:48.740153 kubelet[2932]: E0114 01:03:48.740124 2932 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-n-de0c74fc75\" not found" Jan 14 01:03:48.796356 systemd[1]: Created slice kubepods-burstable-pod7089cd88cf737a80dc3e21df3d527fff.slice - libcontainer container kubepods-burstable-pod7089cd88cf737a80dc3e21df3d527fff.slice. Jan 14 01:03:48.802328 kubelet[2932]: E0114 01:03:48.801745 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.806868 systemd[1]: Created slice kubepods-burstable-podfd371944796ececeefd41c943771dfcb.slice - libcontainer container kubepods-burstable-podfd371944796ececeefd41c943771dfcb.slice. Jan 14 01:03:48.815231 kubelet[2932]: E0114 01:03:48.815034 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.817849 systemd[1]: Created slice kubepods-burstable-pode6140c4732fcf35da7d5f7288737d2e0.slice - libcontainer container kubepods-burstable-pode6140c4732fcf35da7d5f7288737d2e0.slice. 
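The eviction manager starting above enforces the HardEvictionThresholds embedded in the "Creating Container Manager object based on Node Config" entry a little earlier, which logs the resolved config as JSON. A minimal sketch for reading those thresholds back out of that line; the helper name and stdin input are assumptions:

    import json
    import sys

    def eviction_thresholds(line: str):
        # The JSON object starts right after "nodeConfig="; raw_decode stops
        # at its closing brace, so any trailing log text is ignored.
        start = line.find("nodeConfig=")
        if start < 0:
            return []
        cfg, _ = json.JSONDecoder().raw_decode(line, start + len("nodeConfig="))
        return [(t["Signal"], t["Value"]) for t in cfg.get("HardEvictionThresholds", [])]

    for line in sys.stdin:
        for signal, value in eviction_thresholds(line):
            print(signal, value)

For the node config above this yields memory.available at 100Mi plus percentage-based thresholds for nodefs.available, nodefs.inodesFree, imagefs.available and imagefs.inodesFree.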
Jan 14 01:03:48.819371 kubelet[2932]: E0114 01:03:48.819356 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.841188 kubelet[2932]: I0114 01:03:48.840969 2932 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.841271 kubelet[2932]: E0114 01:03:48.841227 2932 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.32:6443/api/v1/nodes\": dial tcp 10.0.21.32:6443: connect: connection refused" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842636 kubelet[2932]: I0114 01:03:48.842620 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842823 kubelet[2932]: I0114 01:03:48.842640 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842823 kubelet[2932]: I0114 01:03:48.842657 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842823 kubelet[2932]: I0114 01:03:48.842674 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842823 kubelet[2932]: I0114 01:03:48.842690 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e6140c4732fcf35da7d5f7288737d2e0-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-de0c74fc75\" (UID: \"e6140c4732fcf35da7d5f7288737d2e0\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842823 kubelet[2932]: I0114 01:03:48.842727 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842934 kubelet[2932]: I0114 01:03:48.842750 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-k8s-certs\") pod 
\"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842934 kubelet[2932]: I0114 01:03:48.842767 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:48.842934 kubelet[2932]: I0114 01:03:48.842781 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:49.043260 kubelet[2932]: I0114 01:03:49.043185 2932 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:49.043487 kubelet[2932]: E0114 01:03:49.043469 2932 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.32:6443/api/v1/nodes\": dial tcp 10.0.21.32:6443: connect: connection refused" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:49.047929 kubelet[2932]: E0114 01:03:49.047901 2932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-de0c74fc75?timeout=10s\": dial tcp 10.0.21.32:6443: connect: connection refused" interval="800ms" Jan 14 01:03:49.107578 containerd[1695]: time="2026-01-14T01:03:49.107436082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-de0c74fc75,Uid:7089cd88cf737a80dc3e21df3d527fff,Namespace:kube-system,Attempt:0,}" Jan 14 01:03:49.118267 containerd[1695]: time="2026-01-14T01:03:49.118140491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-de0c74fc75,Uid:fd371944796ececeefd41c943771dfcb,Namespace:kube-system,Attempt:0,}" Jan 14 01:03:49.122776 containerd[1695]: time="2026-01-14T01:03:49.122624163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-de0c74fc75,Uid:e6140c4732fcf35da7d5f7288737d2e0,Namespace:kube-system,Attempt:0,}" Jan 14 01:03:49.162798 kubelet[2932]: E0114 01:03:49.162755 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.21.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-de0c74fc75&limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 01:03:49.215318 kubelet[2932]: E0114 01:03:49.215288 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.21.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 14 01:03:49.445491 kubelet[2932]: I0114 01:03:49.445282 2932 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:49.445599 kubelet[2932]: E0114 
01:03:49.445539 2932 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.32:6443/api/v1/nodes\": dial tcp 10.0.21.32:6443: connect: connection refused" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:49.700503 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount386366943.mount: Deactivated successfully. Jan 14 01:03:49.711090 containerd[1695]: time="2026-01-14T01:03:49.710699579Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:03:49.712093 containerd[1695]: time="2026-01-14T01:03:49.712066796Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:03:49.716005 containerd[1695]: time="2026-01-14T01:03:49.715977609Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:03:49.718863 containerd[1695]: time="2026-01-14T01:03:49.718835512Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:03:49.720350 containerd[1695]: time="2026-01-14T01:03:49.720332750Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:03:49.721537 containerd[1695]: time="2026-01-14T01:03:49.721516293Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:03:49.723795 containerd[1695]: time="2026-01-14T01:03:49.723765472Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:03:49.724330 containerd[1695]: time="2026-01-14T01:03:49.724307862Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 597.884578ms" Jan 14 01:03:49.726840 containerd[1695]: time="2026-01-14T01:03:49.726766931Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:03:49.731635 containerd[1695]: time="2026-01-14T01:03:49.731609999Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 621.196773ms" Jan 14 01:03:49.738097 containerd[1695]: time="2026-01-14T01:03:49.738045282Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size 
\"320368\" in 617.15541ms" Jan 14 01:03:49.764042 containerd[1695]: time="2026-01-14T01:03:49.763939542Z" level=info msg="connecting to shim 486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740" address="unix:///run/containerd/s/dc4304f06e6fba422e37736d7f49e28eb67a61c4050768112b4d4b4bd5abc2ca" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:03:49.765921 containerd[1695]: time="2026-01-14T01:03:49.765730840Z" level=info msg="connecting to shim 3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933" address="unix:///run/containerd/s/936907b033bbc4179d732c3258832c11640e3f160c62452e7c24fbd168774239" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:03:49.776505 kubelet[2932]: E0114 01:03:49.776477 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.21.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 01:03:49.786254 containerd[1695]: time="2026-01-14T01:03:49.786007967Z" level=info msg="connecting to shim 697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee" address="unix:///run/containerd/s/8a18ef3b0d6599cb82c81ae7085239f9a2bf226ef3c22999d5a97c6fd8ac016e" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:03:49.797252 systemd[1]: Started cri-containerd-3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933.scope - libcontainer container 3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933. Jan 14 01:03:49.803532 systemd[1]: Started cri-containerd-486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740.scope - libcontainer container 486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740. 
Jan 14 01:03:49.813000 audit: BPF prog-id=87 op=LOAD Jan 14 01:03:49.814000 audit: BPF prog-id=88 op=LOAD Jan 14 01:03:49.814000 audit[3010]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.814000 audit: BPF prog-id=88 op=UNLOAD Jan 14 01:03:49.814000 audit[3010]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.814000 audit: BPF prog-id=89 op=LOAD Jan 14 01:03:49.814000 audit[3010]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.815000 audit: BPF prog-id=90 op=LOAD Jan 14 01:03:49.815000 audit[3010]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.815000 audit: BPF prog-id=90 op=UNLOAD Jan 14 01:03:49.815000 audit[3010]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.815000 audit: BPF prog-id=89 op=UNLOAD Jan 14 01:03:49.815000 audit[3010]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.815000 audit: BPF prog-id=91 op=LOAD Jan 14 01:03:49.815000 audit[3010]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2993 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363366134613136366535363564623930646666386634396263333963 Jan 14 01:03:49.824208 systemd[1]: Started cri-containerd-697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee.scope - libcontainer container 697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee. Jan 14 01:03:49.825000 audit: BPF prog-id=92 op=LOAD Jan 14 01:03:49.826000 audit: BPF prog-id=93 op=LOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=93 op=UNLOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=94 op=LOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=95 op=LOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=95 op=UNLOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=94 op=UNLOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.826000 audit: BPF prog-id=96 op=LOAD Jan 14 01:03:49.826000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2989 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438366636653834646538616266313635363462326539656365356131 Jan 14 01:03:49.836000 audit: BPF prog-id=97 op=LOAD Jan 14 01:03:49.836000 audit: BPF prog-id=98 op=LOAD Jan 14 01:03:49.836000 audit[3056]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.836000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=98 op=UNLOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=99 op=LOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=100 op=LOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=100 op=UNLOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=99 op=UNLOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.837000 audit: BPF prog-id=101 op=LOAD Jan 14 01:03:49.837000 audit[3056]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3035 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639376338666130643663643865353134636335623738396436316564 Jan 14 01:03:49.850340 kubelet[2932]: E0114 01:03:49.850201 2932 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-de0c74fc75?timeout=10s\": dial tcp 10.0.21.32:6443: connect: connection refused" interval="1.6s" Jan 14 01:03:49.870716 containerd[1695]: time="2026-01-14T01:03:49.870584716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-de0c74fc75,Uid:7089cd88cf737a80dc3e21df3d527fff,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933\"" Jan 14 01:03:49.883853 containerd[1695]: time="2026-01-14T01:03:49.883041284Z" level=info msg="CreateContainer within sandbox \"3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 14 01:03:49.893951 containerd[1695]: time="2026-01-14T01:03:49.893925094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-de0c74fc75,Uid:e6140c4732fcf35da7d5f7288737d2e0,Namespace:kube-system,Attempt:0,} returns sandbox id \"486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740\"" Jan 14 01:03:49.896403 containerd[1695]: time="2026-01-14T01:03:49.896380160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-de0c74fc75,Uid:fd371944796ececeefd41c943771dfcb,Namespace:kube-system,Attempt:0,} returns sandbox id \"697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee\"" Jan 14 01:03:49.899145 containerd[1695]: time="2026-01-14T01:03:49.899112080Z" level=info msg="CreateContainer within sandbox \"486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 14 01:03:49.899371 containerd[1695]: time="2026-01-14T01:03:49.899285341Z" level=info msg="Container 7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:03:49.901195 containerd[1695]: time="2026-01-14T01:03:49.901172668Z" level=info msg="CreateContainer within sandbox \"697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 14 01:03:49.913520 containerd[1695]: time="2026-01-14T01:03:49.913448504Z" level=info msg="CreateContainer within sandbox \"3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9\"" Jan 14 01:03:49.914103 containerd[1695]: time="2026-01-14T01:03:49.914086984Z" level=info msg="StartContainer for \"7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9\"" Jan 14 01:03:49.915170 containerd[1695]: time="2026-01-14T01:03:49.915153896Z" level=info msg="connecting to shim 7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9" address="unix:///run/containerd/s/936907b033bbc4179d732c3258832c11640e3f160c62452e7c24fbd168774239" protocol=ttrpc version=3 Jan 14 01:03:49.918687 containerd[1695]: time="2026-01-14T01:03:49.918665804Z" level=info msg="Container 2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:03:49.922224 containerd[1695]: time="2026-01-14T01:03:49.922086307Z" level=info msg="Container 45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:03:49.928397 containerd[1695]: time="2026-01-14T01:03:49.928372525Z" level=info msg="CreateContainer within sandbox \"486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f\"" Jan 14 01:03:49.930073 containerd[1695]: time="2026-01-14T01:03:49.929927511Z" level=info msg="StartContainer for \"2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f\"" Jan 14 01:03:49.932409 containerd[1695]: time="2026-01-14T01:03:49.932387331Z" level=info msg="connecting to shim 2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f" address="unix:///run/containerd/s/dc4304f06e6fba422e37736d7f49e28eb67a61c4050768112b4d4b4bd5abc2ca" protocol=ttrpc version=3 Jan 14 01:03:49.933243 containerd[1695]: time="2026-01-14T01:03:49.933210548Z" level=info msg="CreateContainer within sandbox \"697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8\"" Jan 14 01:03:49.933677 containerd[1695]: time="2026-01-14T01:03:49.933599975Z" level=info msg="StartContainer for \"45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8\"" Jan 14 01:03:49.934724 containerd[1695]: time="2026-01-14T01:03:49.934705061Z" level=info msg="connecting to shim 45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8" address="unix:///run/containerd/s/8a18ef3b0d6599cb82c81ae7085239f9a2bf226ef3c22999d5a97c6fd8ac016e" protocol=ttrpc version=3 Jan 14 01:03:49.935318 systemd[1]: Started cri-containerd-7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9.scope - libcontainer container 7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9. Jan 14 01:03:49.954184 systemd[1]: Started cri-containerd-45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8.scope - libcontainer container 45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8. Jan 14 01:03:49.957144 systemd[1]: Started cri-containerd-2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f.scope - libcontainer container 2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f. 
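The audit PROCTITLE records above and below carry the runc invocation as one long hex string: auditd hex-encodes the process title because the argv entries are separated by NUL bytes. A minimal sketch (plain Python, not part of this log) of how such a field can be turned back into a readable command line; the sample value is just the prefix of the hex seen in the records above:

```python
# Minimal sketch: recover the readable command line from an audit PROCTITLE
# field. auditd hex-encodes the value because argv entries are separated by
# NUL bytes; replacing the NULs with spaces yields the original invocation.
def decode_proctitle(hex_value: str) -> str:
    return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", "replace")

if __name__ == "__main__":
    # Prefix of the hex string printed in the records above (illustrative only).
    sample = ("72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
              "2F6B38732E696F002D2D6C6F67")
    print(decode_proctitle(sample))
    # -> runc --root /run/containerd/runc/k8s.io --log
```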
Jan 14 01:03:49.958000 audit: BPF prog-id=102 op=LOAD Jan 14 01:03:49.959000 audit: BPF prog-id=103 op=LOAD Jan 14 01:03:49.959000 audit[3116]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.959000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=103 op=UNLOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=104 op=LOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=105 op=LOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=105 op=UNLOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=104 op=UNLOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.960000 audit: BPF prog-id=106 op=LOAD Jan 14 01:03:49.960000 audit[3116]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=2993 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736313662343932316164663236626666663535343132383832643665 Jan 14 01:03:49.970000 audit: BPF prog-id=107 op=LOAD Jan 14 01:03:49.970000 audit: BPF prog-id=108 op=LOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=108 op=UNLOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=109 op=LOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=110 op=LOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=3035 pid=3130 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=110 op=UNLOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=109 op=UNLOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.970000 audit: BPF prog-id=111 op=LOAD Jan 14 01:03:49.970000 audit[3130]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=3035 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435663133376166343135346363643966633034633835636332613531 Jan 14 01:03:49.990000 audit: BPF prog-id=112 op=LOAD Jan 14 01:03:49.991000 audit: BPF prog-id=113 op=LOAD Jan 14 01:03:49.991000 audit[3128]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000220238 a2=98 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.991000 audit: BPF prog-id=113 op=UNLOAD Jan 14 01:03:49.991000 audit[3128]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.992000 audit: BPF prog-id=114 op=LOAD Jan 14 01:03:49.992000 audit[3128]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000220488 a2=98 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.992000 audit: BPF prog-id=115 op=LOAD Jan 14 01:03:49.992000 audit[3128]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000220218 a2=98 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.993000 audit: BPF prog-id=115 op=UNLOAD Jan 14 01:03:49.993000 audit[3128]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.993000 audit: BPF prog-id=114 op=UNLOAD Jan 14 01:03:49.993000 audit[3128]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:49.993000 audit: BPF prog-id=116 op=LOAD Jan 14 01:03:49.993000 audit[3128]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002206e8 a2=98 a3=0 items=0 ppid=2989 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:03:49.993000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263663063653434643336303735343264636439303938666662353234 Jan 14 01:03:50.018934 containerd[1695]: time="2026-01-14T01:03:50.018892025Z" level=info msg="StartContainer for \"7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9\" returns successfully" Jan 14 01:03:50.022629 containerd[1695]: time="2026-01-14T01:03:50.022603326Z" level=info msg="StartContainer for \"45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8\" returns successfully" Jan 14 01:03:50.050968 kubelet[2932]: E0114 01:03:50.050933 2932 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.21.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 14 01:03:50.085202 containerd[1695]: time="2026-01-14T01:03:50.085172354Z" level=info msg="StartContainer for \"2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f\" returns successfully" Jan 14 01:03:50.248901 kubelet[2932]: I0114 01:03:50.248369 2932 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:50.494645 kubelet[2932]: E0114 01:03:50.494622 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:50.498116 kubelet[2932]: E0114 01:03:50.498097 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:50.499625 kubelet[2932]: E0114 01:03:50.499566 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.454315 kubelet[2932]: E0114 01:03:51.454275 2932 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.501480 kubelet[2932]: E0114 01:03:51.501437 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.502109 kubelet[2932]: E0114 01:03:51.502096 2932 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-de0c74fc75\" not found" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.549913 kubelet[2932]: I0114 01:03:51.549695 2932 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.639739 kubelet[2932]: I0114 01:03:51.639705 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.648524 kubelet[2932]: E0114 01:03:51.648457 2932 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.648524 kubelet[2932]: I0114 
01:03:51.648480 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.650559 kubelet[2932]: E0114 01:03:51.650512 2932 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.650559 kubelet[2932]: I0114 01:03:51.650529 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:51.652994 kubelet[2932]: E0114 01:03:51.652972 2932 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-de0c74fc75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:52.077321 kubelet[2932]: I0114 01:03:52.077088 2932 apiserver.go:52] "Watching apiserver" Jan 14 01:03:52.141819 kubelet[2932]: I0114 01:03:52.141786 2932 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 14 01:03:53.264491 systemd[1]: Reload requested from client PID 3219 ('systemctl') (unit session-8.scope)... Jan 14 01:03:53.264753 systemd[1]: Reloading... Jan 14 01:03:53.351069 zram_generator::config[3261]: No configuration found. Jan 14 01:03:53.546531 systemd[1]: Reloading finished in 281 ms. Jan 14 01:03:53.578294 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:03:53.590475 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 01:03:53.590700 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:03:53.593166 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 14 01:03:53.593206 kernel: audit: type=1131 audit(1768352633.589:411): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:53.589000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:53.590757 systemd[1]: kubelet.service: Consumed 961ms CPU time, 124.6M memory peak. Jan 14 01:03:53.596459 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:03:53.597000 audit: BPF prog-id=117 op=LOAD Jan 14 01:03:53.600592 kernel: audit: type=1334 audit(1768352633.597:412): prog-id=117 op=LOAD Jan 14 01:03:53.600642 kernel: audit: type=1334 audit(1768352633.597:413): prog-id=79 op=UNLOAD Jan 14 01:03:53.597000 audit: BPF prog-id=79 op=UNLOAD Jan 14 01:03:53.601734 kernel: audit: type=1334 audit(1768352633.597:414): prog-id=118 op=LOAD Jan 14 01:03:53.597000 audit: BPF prog-id=118 op=LOAD Jan 14 01:03:53.602822 kernel: audit: type=1334 audit(1768352633.597:415): prog-id=80 op=UNLOAD Jan 14 01:03:53.597000 audit: BPF prog-id=80 op=UNLOAD Jan 14 01:03:53.603863 kernel: audit: type=1334 audit(1768352633.598:416): prog-id=119 op=LOAD Jan 14 01:03:53.598000 audit: BPF prog-id=119 op=LOAD Jan 14 01:03:53.604920 kernel: audit: type=1334 audit(1768352633.598:417): prog-id=71 op=UNLOAD Jan 14 01:03:53.598000 audit: BPF prog-id=71 op=UNLOAD Jan 14 01:03:53.605988 kernel: audit: type=1334 audit(1768352633.598:418): prog-id=120 op=LOAD Jan 14 01:03:53.598000 audit: BPF prog-id=120 op=LOAD Jan 14 01:03:53.598000 audit: BPF prog-id=121 op=LOAD Jan 14 01:03:53.598000 audit: BPF prog-id=72 op=UNLOAD Jan 14 01:03:53.607304 kernel: audit: type=1334 audit(1768352633.598:419): prog-id=121 op=LOAD Jan 14 01:03:53.607355 kernel: audit: type=1334 audit(1768352633.598:420): prog-id=72 op=UNLOAD Jan 14 01:03:53.598000 audit: BPF prog-id=73 op=UNLOAD Jan 14 01:03:53.600000 audit: BPF prog-id=122 op=LOAD Jan 14 01:03:53.600000 audit: BPF prog-id=70 op=UNLOAD Jan 14 01:03:53.600000 audit: BPF prog-id=123 op=LOAD Jan 14 01:03:53.600000 audit: BPF prog-id=81 op=UNLOAD Jan 14 01:03:53.600000 audit: BPF prog-id=124 op=LOAD Jan 14 01:03:53.600000 audit: BPF prog-id=125 op=LOAD Jan 14 01:03:53.600000 audit: BPF prog-id=82 op=UNLOAD Jan 14 01:03:53.600000 audit: BPF prog-id=83 op=UNLOAD Jan 14 01:03:53.602000 audit: BPF prog-id=126 op=LOAD Jan 14 01:03:53.602000 audit: BPF prog-id=127 op=LOAD Jan 14 01:03:53.602000 audit: BPF prog-id=77 op=UNLOAD Jan 14 01:03:53.602000 audit: BPF prog-id=78 op=UNLOAD Jan 14 01:03:53.603000 audit: BPF prog-id=128 op=LOAD Jan 14 01:03:53.603000 audit: BPF prog-id=74 op=UNLOAD Jan 14 01:03:53.603000 audit: BPF prog-id=129 op=LOAD Jan 14 01:03:53.603000 audit: BPF prog-id=130 op=LOAD Jan 14 01:03:53.603000 audit: BPF prog-id=75 op=UNLOAD Jan 14 01:03:53.603000 audit: BPF prog-id=76 op=UNLOAD Jan 14 01:03:53.608000 audit: BPF prog-id=131 op=LOAD Jan 14 01:03:53.608000 audit: BPF prog-id=84 op=UNLOAD Jan 14 01:03:53.608000 audit: BPF prog-id=132 op=LOAD Jan 14 01:03:53.608000 audit: BPF prog-id=133 op=LOAD Jan 14 01:03:53.608000 audit: BPF prog-id=85 op=UNLOAD Jan 14 01:03:53.609000 audit: BPF prog-id=86 op=UNLOAD Jan 14 01:03:53.610000 audit: BPF prog-id=134 op=LOAD Jan 14 01:03:53.610000 audit: BPF prog-id=67 op=UNLOAD Jan 14 01:03:53.610000 audit: BPF prog-id=135 op=LOAD Jan 14 01:03:53.610000 audit: BPF prog-id=136 op=LOAD Jan 14 01:03:53.610000 audit: BPF prog-id=68 op=UNLOAD Jan 14 01:03:53.610000 audit: BPF prog-id=69 op=UNLOAD Jan 14 01:03:53.730000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:03:53.730626 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
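The SERVICE_STOP/SERVICE_START audit records bracket the kubelet restart, and the burst of paired "BPF prog-id=... op=LOAD/UNLOAD" records in between most likely reflects systemd detaching and reattaching its per-unit cgroup BPF programs while the units are reloaded. A small sketch that tallies these records from journal text; piping the journal on stdin (e.g. `journalctl | python3 tally_bpf.py`) is an assumption, not something the log prescribes:

```python
# Minimal sketch: count audit "BPF prog-id=<N> op=LOAD/UNLOAD" records from
# journal text on stdin to see how many BPF programs were swapped around the
# service reload above.
import re
import sys
from collections import Counter

BPF_RE = re.compile(r"BPF prog-id=(\d+) op=(LOAD|UNLOAD)")

def tally(lines):
    ops = Counter()
    for line in lines:
        for _prog_id, op in BPF_RE.findall(line):
            ops[op] += 1
    return ops

if __name__ == "__main__":
    print(tally(sys.stdin))
```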
Jan 14 01:03:53.742636 (kubelet)[3316]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:03:53.788559 kubelet[3316]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:03:53.788559 kubelet[3316]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:03:53.789069 kubelet[3316]: I0114 01:03:53.788586 3316 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:03:53.797119 kubelet[3316]: I0114 01:03:53.796495 3316 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 14 01:03:53.797119 kubelet[3316]: I0114 01:03:53.796513 3316 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:03:53.797119 kubelet[3316]: I0114 01:03:53.796537 3316 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 14 01:03:53.797119 kubelet[3316]: I0114 01:03:53.796542 3316 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:03:53.797119 kubelet[3316]: I0114 01:03:53.796725 3316 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:03:53.799094 kubelet[3316]: I0114 01:03:53.798425 3316 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 14 01:03:53.803461 kubelet[3316]: I0114 01:03:53.803150 3316 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:03:53.807227 kubelet[3316]: I0114 01:03:53.807205 3316 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:03:53.810886 kubelet[3316]: I0114 01:03:53.810575 3316 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 14 01:03:53.810886 kubelet[3316]: I0114 01:03:53.810774 3316 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:03:53.810985 kubelet[3316]: I0114 01:03:53.810791 3316 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-de0c74fc75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:03:53.810985 kubelet[3316]: I0114 01:03:53.810984 3316 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 01:03:53.811089 kubelet[3316]: I0114 01:03:53.810992 3316 container_manager_linux.go:306] "Creating device plugin manager" Jan 14 01:03:53.811089 kubelet[3316]: I0114 01:03:53.811013 3316 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 14 01:03:53.812432 kubelet[3316]: I0114 01:03:53.812351 3316 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:03:53.812589 kubelet[3316]: I0114 01:03:53.812581 3316 kubelet.go:475] "Attempting to sync node with API server" Jan 14 01:03:53.812617 kubelet[3316]: I0114 01:03:53.812594 3316 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:03:53.812617 kubelet[3316]: I0114 01:03:53.812614 3316 kubelet.go:387] "Adding apiserver pod source" Jan 14 01:03:53.812660 kubelet[3316]: I0114 01:03:53.812626 3316 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:03:53.814383 kubelet[3316]: I0114 01:03:53.814371 3316 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:03:53.815139 kubelet[3316]: I0114 01:03:53.815040 3316 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:03:53.815139 kubelet[3316]: I0114 01:03:53.815085 3316 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 14 
01:03:53.821078 kubelet[3316]: I0114 01:03:53.821061 3316 server.go:1262] "Started kubelet" Jan 14 01:03:53.822647 kubelet[3316]: I0114 01:03:53.822635 3316 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:03:53.832585 kubelet[3316]: I0114 01:03:53.832394 3316 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:03:53.834491 kubelet[3316]: I0114 01:03:53.834473 3316 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:03:53.835207 kubelet[3316]: I0114 01:03:53.835137 3316 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 14 01:03:53.835893 kubelet[3316]: I0114 01:03:53.835883 3316 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 14 01:03:53.836158 kubelet[3316]: I0114 01:03:53.836045 3316 reconciler.go:29] "Reconciler: start to sync state" Jan 14 01:03:53.836895 kubelet[3316]: I0114 01:03:53.836880 3316 server.go:310] "Adding debug handlers to kubelet server" Jan 14 01:03:53.840576 kubelet[3316]: I0114 01:03:53.840402 3316 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:03:53.840576 kubelet[3316]: I0114 01:03:53.840437 3316 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 14 01:03:53.840576 kubelet[3316]: I0114 01:03:53.840554 3316 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:03:53.841671 kubelet[3316]: E0114 01:03:53.841656 3316 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 01:03:53.843442 kubelet[3316]: I0114 01:03:53.843431 3316 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:03:53.843570 kubelet[3316]: I0114 01:03:53.843557 3316 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:03:53.845883 kubelet[3316]: I0114 01:03:53.845522 3316 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:03:53.853292 kubelet[3316]: I0114 01:03:53.853249 3316 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 14 01:03:53.854230 kubelet[3316]: I0114 01:03:53.854211 3316 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 14 01:03:53.854230 kubelet[3316]: I0114 01:03:53.854225 3316 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 14 01:03:53.854303 kubelet[3316]: I0114 01:03:53.854243 3316 kubelet.go:2427] "Starting kubelet main sync loop" Jan 14 01:03:53.854303 kubelet[3316]: E0114 01:03:53.854276 3316 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:03:53.903200 kubelet[3316]: I0114 01:03:53.903025 3316 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:03:53.903389 kubelet[3316]: I0114 01:03:53.903336 3316 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:03:53.903389 kubelet[3316]: I0114 01:03:53.903355 3316 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:03:53.903599 kubelet[3316]: I0114 01:03:53.903590 3316 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 14 01:03:53.903661 kubelet[3316]: I0114 01:03:53.903644 3316 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 14 01:03:53.903746 kubelet[3316]: I0114 01:03:53.903735 3316 policy_none.go:49] "None policy: Start" Jan 14 01:03:53.903807 kubelet[3316]: I0114 01:03:53.903789 3316 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 14 01:03:53.903856 kubelet[3316]: I0114 01:03:53.903846 3316 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 14 01:03:53.904270 kubelet[3316]: I0114 01:03:53.903966 3316 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 14 01:03:53.904270 kubelet[3316]: I0114 01:03:53.903972 3316 policy_none.go:47] "Start" Jan 14 01:03:53.907901 kubelet[3316]: E0114 01:03:53.907884 3316 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:03:53.908027 kubelet[3316]: I0114 01:03:53.908016 3316 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:03:53.908102 kubelet[3316]: I0114 01:03:53.908028 3316 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:03:53.908473 kubelet[3316]: I0114 01:03:53.908462 3316 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:03:53.913065 kubelet[3316]: E0114 01:03:53.913025 3316 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 14 01:03:53.955919 kubelet[3316]: I0114 01:03:53.955893 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:53.956282 kubelet[3316]: I0114 01:03:53.955894 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:53.956430 kubelet[3316]: I0114 01:03:53.956416 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.015482 kubelet[3316]: I0114 01:03:54.015439 3316 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.024114 kubelet[3316]: I0114 01:03:54.024042 3316 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.024315 kubelet[3316]: I0114 01:03:54.024270 3316 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136486 kubelet[3316]: I0114 01:03:54.136460 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136486 kubelet[3316]: I0114 01:03:54.136486 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136617 kubelet[3316]: I0114 01:03:54.136502 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136617 kubelet[3316]: I0114 01:03:54.136518 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136617 kubelet[3316]: I0114 01:03:54.136532 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136617 kubelet[3316]: I0114 01:03:54.136546 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: 
\"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136617 kubelet[3316]: I0114 01:03:54.136559 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fd371944796ececeefd41c943771dfcb-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" (UID: \"fd371944796ececeefd41c943771dfcb\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136726 kubelet[3316]: I0114 01:03:54.136577 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e6140c4732fcf35da7d5f7288737d2e0-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-de0c74fc75\" (UID: \"e6140c4732fcf35da7d5f7288737d2e0\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.136726 kubelet[3316]: I0114 01:03:54.136592 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7089cd88cf737a80dc3e21df3d527fff-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" (UID: \"7089cd88cf737a80dc3e21df3d527fff\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.826194 kubelet[3316]: I0114 01:03:54.826121 3316 apiserver.go:52] "Watching apiserver" Jan 14 01:03:54.836407 kubelet[3316]: I0114 01:03:54.836368 3316 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 14 01:03:54.887065 kubelet[3316]: I0114 01:03:54.887025 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.887316 kubelet[3316]: I0114 01:03:54.887307 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.887451 kubelet[3316]: I0114 01:03:54.887381 3316 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.896517 kubelet[3316]: E0114 01:03:54.896485 3316 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-de0c74fc75\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.897687 kubelet[3316]: E0114 01:03:54.897669 3316 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-de0c74fc75\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.897926 kubelet[3316]: E0114 01:03:54.897913 3316 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-de0c74fc75\" already exists" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" Jan 14 01:03:54.917062 kubelet[3316]: I0114 01:03:54.916911 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-n-de0c74fc75" podStartSLOduration=1.9168962139999999 podStartE2EDuration="1.916896214s" podCreationTimestamp="2026-01-14 01:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:03:54.907671017 +0000 UTC m=+1.159334321" watchObservedRunningTime="2026-01-14 01:03:54.916896214 +0000 UTC m=+1.168559499" Jan 14 01:03:54.917226 
kubelet[3316]: I0114 01:03:54.917147 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-n-de0c74fc75" podStartSLOduration=1.917123509 podStartE2EDuration="1.917123509s" podCreationTimestamp="2026-01-14 01:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:03:54.916721483 +0000 UTC m=+1.168384792" watchObservedRunningTime="2026-01-14 01:03:54.917123509 +0000 UTC m=+1.168786809" Jan 14 01:03:54.935905 kubelet[3316]: I0114 01:03:54.935534 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-de0c74fc75" podStartSLOduration=1.935519284 podStartE2EDuration="1.935519284s" podCreationTimestamp="2026-01-14 01:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:03:54.926407271 +0000 UTC m=+1.178070578" watchObservedRunningTime="2026-01-14 01:03:54.935519284 +0000 UTC m=+1.187182591" Jan 14 01:03:58.854950 kubelet[3316]: I0114 01:03:58.854890 3316 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 14 01:03:58.857323 containerd[1695]: time="2026-01-14T01:03:58.857237663Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 14 01:03:58.858523 kubelet[3316]: I0114 01:03:58.858502 3316 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 14 01:03:59.460610 systemd[1]: Created slice kubepods-besteffort-pod1f5107fd_f970_4f7d_b717_90b781f3cab6.slice - libcontainer container kubepods-besteffort-pod1f5107fd_f970_4f7d_b717_90b781f3cab6.slice. 
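For these static pods the pull timestamps are the zero value, and each podStartSLOduration above appears to be simply watchObservedRunningTime minus podCreationTimestamp. A quick check of the kube-apiserver figure, with the nanosecond timestamp truncated to microseconds because datetime cannot hold the extra digits:

```python
# Quick check of the kube-apiserver figure reported above:
# podStartSLOduration=1.916896214 should equal
# watchObservedRunningTime - podCreationTimestamp for a static pod with no pull.
from datetime import datetime, timezone

created  = datetime(2026, 1, 14, 1, 3, 53, tzinfo=timezone.utc)
observed = datetime(2026, 1, 14, 1, 3, 54, 916896, tzinfo=timezone.utc)  # truncated to µs
print((observed - created).total_seconds())  # 1.916896 ≈ 1.916896214 s
```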
Jan 14 01:03:59.463626 kubelet[3316]: I0114 01:03:59.463590 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f5107fd-f970-4f7d-b717-90b781f3cab6-lib-modules\") pod \"kube-proxy-d4slc\" (UID: \"1f5107fd-f970-4f7d-b717-90b781f3cab6\") " pod="kube-system/kube-proxy-d4slc" Jan 14 01:03:59.463626 kubelet[3316]: I0114 01:03:59.463624 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxfx\" (UniqueName: \"kubernetes.io/projected/1f5107fd-f970-4f7d-b717-90b781f3cab6-kube-api-access-rdxfx\") pod \"kube-proxy-d4slc\" (UID: \"1f5107fd-f970-4f7d-b717-90b781f3cab6\") " pod="kube-system/kube-proxy-d4slc" Jan 14 01:03:59.463743 kubelet[3316]: I0114 01:03:59.463641 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1f5107fd-f970-4f7d-b717-90b781f3cab6-kube-proxy\") pod \"kube-proxy-d4slc\" (UID: \"1f5107fd-f970-4f7d-b717-90b781f3cab6\") " pod="kube-system/kube-proxy-d4slc" Jan 14 01:03:59.463743 kubelet[3316]: I0114 01:03:59.463655 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1f5107fd-f970-4f7d-b717-90b781f3cab6-xtables-lock\") pod \"kube-proxy-d4slc\" (UID: \"1f5107fd-f970-4f7d-b717-90b781f3cab6\") " pod="kube-system/kube-proxy-d4slc" Jan 14 01:03:59.569375 kubelet[3316]: E0114 01:03:59.569342 3316 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jan 14 01:03:59.569656 kubelet[3316]: E0114 01:03:59.569487 3316 projected.go:196] Error preparing data for projected volume kube-api-access-rdxfx for pod kube-system/kube-proxy-d4slc: configmap "kube-root-ca.crt" not found Jan 14 01:03:59.569656 kubelet[3316]: E0114 01:03:59.569538 3316 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f5107fd-f970-4f7d-b717-90b781f3cab6-kube-api-access-rdxfx podName:1f5107fd-f970-4f7d-b717-90b781f3cab6 nodeName:}" failed. No retries permitted until 2026-01-14 01:04:00.069521381 +0000 UTC m=+6.321184675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rdxfx" (UniqueName: "kubernetes.io/projected/1f5107fd-f970-4f7d-b717-90b781f3cab6-kube-api-access-rdxfx") pod "kube-proxy-d4slc" (UID: "1f5107fd-f970-4f7d-b717-90b781f3cab6") : configmap "kube-root-ca.crt" not found Jan 14 01:04:00.043528 systemd[1]: Created slice kubepods-besteffort-poda1ca7c2f_940a_4676_b81e_01fa07eb9a61.slice - libcontainer container kubepods-besteffort-poda1ca7c2f_940a_4676_b81e_01fa07eb9a61.slice. 
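The failed kube-api-access-rdxfx mount above is rescheduled with durationBeforeRetry 500ms: the "No retries permitted until 01:04:00.069521381" target looks like the failure instant plus that backoff, which lands right between the surrounding 01:03:59.569... error timestamps; the retry evidently succeeds, since the kube-proxy-d4slc sandbox is created below. A sketch of that arithmetic (the failure instant itself is not printed, so it is inferred here):

```python
# The retry target printed above (01:04:00.069521381) minus the stated 500 ms
# backoff gives 01:03:59.569521..., consistent with the error timestamps above.
from datetime import datetime, timedelta, timezone

retry_at = datetime(2026, 1, 14, 1, 4, 0, 69521, tzinfo=timezone.utc)  # truncated to µs
backoff  = timedelta(milliseconds=500)
print(retry_at - backoff)  # 2026-01-14 01:03:59.569521+00:00
```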
Jan 14 01:04:00.069056 kubelet[3316]: I0114 01:04:00.069014 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a1ca7c2f-940a-4676-b81e-01fa07eb9a61-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-rdwnf\" (UID: \"a1ca7c2f-940a-4676-b81e-01fa07eb9a61\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-rdwnf" Jan 14 01:04:00.069056 kubelet[3316]: I0114 01:04:00.069045 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tckm4\" (UniqueName: \"kubernetes.io/projected/a1ca7c2f-940a-4676-b81e-01fa07eb9a61-kube-api-access-tckm4\") pod \"tigera-operator-65cdcdfd6d-rdwnf\" (UID: \"a1ca7c2f-940a-4676-b81e-01fa07eb9a61\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-rdwnf" Jan 14 01:04:00.350856 containerd[1695]: time="2026-01-14T01:04:00.350755214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-rdwnf,Uid:a1ca7c2f-940a-4676-b81e-01fa07eb9a61,Namespace:tigera-operator,Attempt:0,}" Jan 14 01:04:00.368859 containerd[1695]: time="2026-01-14T01:04:00.368772568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-d4slc,Uid:1f5107fd-f970-4f7d-b717-90b781f3cab6,Namespace:kube-system,Attempt:0,}" Jan 14 01:04:00.377478 containerd[1695]: time="2026-01-14T01:04:00.377422709Z" level=info msg="connecting to shim 337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19" address="unix:///run/containerd/s/d2f20eec08247b32b799b7380dd6c7427615b3320d2cba5f839f7f36d4c46bf4" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:00.400519 containerd[1695]: time="2026-01-14T01:04:00.400323731Z" level=info msg="connecting to shim 489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46" address="unix:///run/containerd/s/bf3e33884428aa81ae7ff0fc10e7be5c3b50baa3f5e4cb95df58bf1ba38f1de9" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:00.406228 systemd[1]: Started cri-containerd-337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19.scope - libcontainer container 337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19. 
Jan 14 01:04:00.415000 audit: BPF prog-id=137 op=LOAD Jan 14 01:04:00.417430 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 14 01:04:00.417477 kernel: audit: type=1334 audit(1768352640.415:453): prog-id=137 op=LOAD Jan 14 01:04:00.418000 audit: BPF prog-id=138 op=LOAD Jan 14 01:04:00.421121 kernel: audit: type=1334 audit(1768352640.418:454): prog-id=138 op=LOAD Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.426075 kernel: audit: type=1300 audit(1768352640.418:454): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.426120 kernel: audit: type=1327 audit(1768352640.418:454): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=138 op=UNLOAD Jan 14 01:04:00.429459 kernel: audit: type=1334 audit(1768352640.418:455): prog-id=138 op=UNLOAD Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.431057 kernel: audit: type=1300 audit(1768352640.418:455): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=139 op=LOAD Jan 14 01:04:00.438301 kernel: audit: type=1327 audit(1768352640.418:455): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.438328 kernel: audit: type=1334 audit(1768352640.418:456): prog-id=139 op=LOAD Jan 14 01:04:00.438452 systemd[1]: Started cri-containerd-489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46.scope - libcontainer container 489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46. 
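The containerd "RunPodSandbox ... returns sandbox id" entries above and below pair each pod with the sandbox whose cri-containerd scope systemd then starts. A minimal sketch that extracts those pairs from journal text; reading the journal from stdin (e.g. `journalctl -u containerd | python3 sandboxes.py`) and the script name are assumptions for illustration:

```python
# Minimal sketch: map pod names to the sandbox ids containerd reports in
# "RunPodSandbox ... returns sandbox id" entries, reading journal text on stdin.
import re
import sys

SANDBOX_RE = re.compile(
    r"RunPodSandbox for &PodSandboxMetadata\{Name:([^,]+),.*?"
    r'returns sandbox id \\?"([0-9a-f]+)\\?"'
)

def sandboxes(lines):
    for line in lines:
        for name, sandbox_id in SANDBOX_RE.findall(line):
            yield name, sandbox_id

if __name__ == "__main__":
    for name, sid in sandboxes(sys.stdin):
        print(f"{name}\t{sid[:12]}")
```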
Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.439584 kernel: audit: type=1300 audit(1768352640.418:456): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=140 op=LOAD Jan 14 01:04:00.446250 kernel: audit: type=1327 audit(1768352640.418:456): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=140 op=UNLOAD Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=139 op=UNLOAD Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.418000 audit: BPF prog-id=141 op=LOAD Jan 14 01:04:00.418000 audit[3384]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3373 
pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333376637323963323765313961663039333035666531313839313330 Jan 14 01:04:00.459000 audit: BPF prog-id=142 op=LOAD Jan 14 01:04:00.459000 audit: BPF prog-id=143 op=LOAD Jan 14 01:04:00.459000 audit[3417]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.459000 audit: BPF prog-id=143 op=UNLOAD Jan 14 01:04:00.459000 audit[3417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.460000 audit: BPF prog-id=144 op=LOAD Jan 14 01:04:00.460000 audit[3417]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.460000 audit: BPF prog-id=145 op=LOAD Jan 14 01:04:00.460000 audit[3417]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.460000 audit: BPF prog-id=145 op=UNLOAD Jan 14 01:04:00.460000 audit[3417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.460000 audit: BPF prog-id=144 op=UNLOAD Jan 14 01:04:00.460000 audit[3417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.460000 audit: BPF prog-id=146 op=LOAD Jan 14 01:04:00.460000 audit[3417]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3402 pid=3417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438396139363561386166643438343531653166626538626636633737 Jan 14 01:04:00.470296 containerd[1695]: time="2026-01-14T01:04:00.470267080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-rdwnf,Uid:a1ca7c2f-940a-4676-b81e-01fa07eb9a61,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19\"" Jan 14 01:04:00.472283 containerd[1695]: time="2026-01-14T01:04:00.472135829Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 14 01:04:00.477074 containerd[1695]: time="2026-01-14T01:04:00.477031576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-d4slc,Uid:1f5107fd-f970-4f7d-b717-90b781f3cab6,Namespace:kube-system,Attempt:0,} returns sandbox id \"489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46\"" Jan 14 01:04:00.482979 containerd[1695]: time="2026-01-14T01:04:00.482948072Z" level=info msg="CreateContainer within sandbox \"489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 14 01:04:00.493633 containerd[1695]: time="2026-01-14T01:04:00.493588635Z" level=info msg="Container 5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:00.502515 containerd[1695]: time="2026-01-14T01:04:00.502483419Z" level=info msg="CreateContainer within sandbox \"489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136\"" Jan 14 01:04:00.504235 containerd[1695]: time="2026-01-14T01:04:00.504203340Z" level=info msg="StartContainer for \"5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136\"" Jan 
14 01:04:00.506289 containerd[1695]: time="2026-01-14T01:04:00.506266996Z" level=info msg="connecting to shim 5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136" address="unix:///run/containerd/s/bf3e33884428aa81ae7ff0fc10e7be5c3b50baa3f5e4cb95df58bf1ba38f1de9" protocol=ttrpc version=3 Jan 14 01:04:00.525252 systemd[1]: Started cri-containerd-5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136.scope - libcontainer container 5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136. Jan 14 01:04:00.582000 audit: BPF prog-id=147 op=LOAD Jan 14 01:04:00.582000 audit[3455]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3402 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.582000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561383734353734613165383730306435353734326166343330393061 Jan 14 01:04:00.583000 audit: BPF prog-id=148 op=LOAD Jan 14 01:04:00.583000 audit[3455]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3402 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.583000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561383734353734613165383730306435353734326166343330393061 Jan 14 01:04:00.583000 audit: BPF prog-id=148 op=UNLOAD Jan 14 01:04:00.583000 audit[3455]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3402 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.583000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561383734353734613165383730306435353734326166343330393061 Jan 14 01:04:00.583000 audit: BPF prog-id=147 op=UNLOAD Jan 14 01:04:00.583000 audit[3455]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3402 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.583000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561383734353734613165383730306435353734326166343330393061 Jan 14 01:04:00.583000 audit: BPF prog-id=149 op=LOAD Jan 14 01:04:00.583000 audit[3455]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3402 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.583000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561383734353734613165383730306435353734326166343330393061 Jan 14 01:04:00.601460 containerd[1695]: time="2026-01-14T01:04:00.601305808Z" level=info msg="StartContainer for \"5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136\" returns successfully" Jan 14 01:04:00.815000 audit[3518]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3518 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.815000 audit[3518]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcb503bc20 a2=0 a3=7ffcb503bc0c items=0 ppid=3467 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.815000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:04:00.816000 audit[3521]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.816000 audit[3521]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe65428650 a2=0 a3=7ffe6542863c items=0 ppid=3467 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.816000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:04:00.816000 audit[3519]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3519 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.816000 audit[3519]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc10e8f530 a2=0 a3=7ffc10e8f51c items=0 ppid=3467 pid=3519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.816000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:04:00.818000 audit[3522]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3522 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.818000 audit[3522]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd1db150d0 a2=0 a3=7ffd1db150bc items=0 ppid=3467 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.818000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:04:00.819000 audit[3523]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3523 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.819000 audit[3523]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe68ffe440 a2=0 a3=7ffe68ffe42c items=0 ppid=3467 pid=3523 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.819000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:04:00.820000 audit[3525]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3525 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.820000 audit[3525]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc51fd36e0 a2=0 a3=7ffc51fd36cc items=0 ppid=3467 pid=3525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.820000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:04:00.922000 audit[3527]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3527 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.922000 audit[3527]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe008663b0 a2=0 a3=7ffe0086639c items=0 ppid=3467 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.922000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:04:00.925000 audit[3529]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3529 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.925000 audit[3529]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffde28a55f0 a2=0 a3=7ffde28a55dc items=0 ppid=3467 pid=3529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.925000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Jan 14 01:04:00.929000 audit[3532]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3532 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.929000 audit[3532]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc24651be0 a2=0 a3=7ffc24651bcc items=0 ppid=3467 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.929000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 14 01:04:00.930000 audit[3533]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.930000 audit[3533]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=100 a0=3 a1=7ffda4f4ae30 a2=0 a3=7ffda4f4ae1c items=0 ppid=3467 pid=3533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.930000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:04:00.932000 audit[3535]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3535 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.932000 audit[3535]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd3ce21ee0 a2=0 a3=7ffd3ce21ecc items=0 ppid=3467 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.932000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:04:00.933000 audit[3536]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3536 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.933000 audit[3536]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc91e74f80 a2=0 a3=7ffc91e74f6c items=0 ppid=3467 pid=3536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.933000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:04:00.936000 audit[3538]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3538 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.936000 audit[3538]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc3a2429f0 a2=0 a3=7ffc3a2429dc items=0 ppid=3467 pid=3538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.936000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:00.939000 audit[3541]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3541 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.939000 audit[3541]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffebc9ea340 a2=0 a3=7ffebc9ea32c items=0 ppid=3467 pid=3541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.939000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:00.940000 audit[3542]: NETFILTER_CFG table=filter:68 
family=2 entries=1 op=nft_register_chain pid=3542 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.940000 audit[3542]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffffdfff4e0 a2=0 a3=7ffffdfff4cc items=0 ppid=3467 pid=3542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.940000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:04:00.942000 audit[3544]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.942000 audit[3544]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff5535d870 a2=0 a3=7fff5535d85c items=0 ppid=3467 pid=3544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.942000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:04:00.943000 audit[3545]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3545 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.943000 audit[3545]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc6ce18f50 a2=0 a3=7ffc6ce18f3c items=0 ppid=3467 pid=3545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.943000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:04:00.945000 audit[3547]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3547 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.945000 audit[3547]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff23b3b610 a2=0 a3=7fff23b3b5fc items=0 ppid=3467 pid=3547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.945000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Jan 14 01:04:00.949000 audit[3550]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3550 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.949000 audit[3550]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe0f17a580 a2=0 a3=7ffe0f17a56c items=0 ppid=3467 pid=3550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.949000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 14 01:04:00.952000 audit[3553]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.952000 audit[3553]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd5cbfeaf0 a2=0 a3=7ffd5cbfeadc items=0 ppid=3467 pid=3553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.952000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 14 01:04:00.953000 audit[3554]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3554 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.953000 audit[3554]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcebbc8e10 a2=0 a3=7ffcebbc8dfc items=0 ppid=3467 pid=3554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.953000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:04:00.955000 audit[3556]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.955000 audit[3556]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffe4e68b4a0 a2=0 a3=7ffe4e68b48c items=0 ppid=3467 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.955000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:00.958000 audit[3559]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3559 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.958000 audit[3559]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe58d666e0 a2=0 a3=7ffe58d666cc items=0 ppid=3467 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.958000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:00.959000 audit[3560]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.959000 audit[3560]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc32c6fe80 a2=0 a3=7ffc32c6fe6c 
items=0 ppid=3467 pid=3560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.959000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:04:00.961000 audit[3562]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3562 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:04:00.961000 audit[3562]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffc04603070 a2=0 a3=7ffc0460305c items=0 ppid=3467 pid=3562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.961000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:04:00.980000 audit[3568]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3568 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:00.980000 audit[3568]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff2a43bda0 a2=0 a3=7fff2a43bd8c items=0 ppid=3467 pid=3568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:00.989000 audit[3568]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3568 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:00.989000 audit[3568]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fff2a43bda0 a2=0 a3=7fff2a43bd8c items=0 ppid=3467 pid=3568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.989000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:00.991000 audit[3573]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3573 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.991000 audit[3573]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe67a58be0 a2=0 a3=7ffe67a58bcc items=0 ppid=3467 pid=3573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.991000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:04:00.993000 audit[3575]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3575 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.993000 audit[3575]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc2cb92cf0 a2=0 a3=7ffc2cb92cdc items=0 ppid=3467 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.993000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 14 01:04:00.996000 audit[3578]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3578 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.996000 audit[3578]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff01e9e080 a2=0 a3=7fff01e9e06c items=0 ppid=3467 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.996000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Jan 14 01:04:00.997000 audit[3579]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3579 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.997000 audit[3579]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd108f2550 a2=0 a3=7ffd108f253c items=0 ppid=3467 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.997000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:04:00.999000 audit[3581]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3581 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:00.999000 audit[3581]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc25e2e950 a2=0 a3=7ffc25e2e93c items=0 ppid=3467 pid=3581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:00.999000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:04:01.000000 audit[3582]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3582 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.000000 audit[3582]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc2ed39fc0 a2=0 a3=7ffc2ed39fac items=0 ppid=3467 pid=3582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.000000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:04:01.003000 audit[3584]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3584 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.003000 
audit[3584]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff259cc450 a2=0 a3=7fff259cc43c items=0 ppid=3467 pid=3584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.003000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:01.007000 audit[3587]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3587 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.007000 audit[3587]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffc16318f50 a2=0 a3=7ffc16318f3c items=0 ppid=3467 pid=3587 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.007000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:01.008000 audit[3588]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3588 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.008000 audit[3588]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff35073940 a2=0 a3=7fff3507392c items=0 ppid=3467 pid=3588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.008000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:04:01.011000 audit[3590]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3590 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.011000 audit[3590]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb67f8360 a2=0 a3=7ffeb67f834c items=0 ppid=3467 pid=3590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.011000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:04:01.012000 audit[3591]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3591 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.012000 audit[3591]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffecdffd830 a2=0 a3=7ffecdffd81c items=0 ppid=3467 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.012000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:04:01.014000 audit[3593]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3593 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.014000 audit[3593]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffb99e70e0 a2=0 a3=7fffb99e70cc items=0 ppid=3467 pid=3593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.014000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 14 01:04:01.018000 audit[3596]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3596 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.018000 audit[3596]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe94f6dcf0 a2=0 a3=7ffe94f6dcdc items=0 ppid=3467 pid=3596 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.018000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 14 01:04:01.023000 audit[3599]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3599 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.023000 audit[3599]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd85eccb00 a2=0 a3=7ffd85eccaec items=0 ppid=3467 pid=3599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.023000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Jan 14 01:04:01.024000 audit[3600]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3600 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.024000 audit[3600]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd1cc9f570 a2=0 a3=7ffd1cc9f55c items=0 ppid=3467 pid=3600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.024000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:04:01.027000 audit[3602]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3602 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.027000 audit[3602]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffeb26d85e0 a2=0 a3=7ffeb26d85cc items=0 ppid=3467 
pid=3602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.027000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:01.031000 audit[3605]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3605 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.031000 audit[3605]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc76b229c0 a2=0 a3=7ffc76b229ac items=0 ppid=3467 pid=3605 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.031000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:04:01.032000 audit[3606]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3606 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.032000 audit[3606]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdb7a77470 a2=0 a3=7ffdb7a7745c items=0 ppid=3467 pid=3606 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.032000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:04:01.034000 audit[3608]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3608 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.034000 audit[3608]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffc474f3390 a2=0 a3=7ffc474f337c items=0 ppid=3467 pid=3608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.034000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:04:01.035000 audit[3609]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3609 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.035000 audit[3609]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd76e4e220 a2=0 a3=7ffd76e4e20c items=0 ppid=3467 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.035000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:04:01.037000 audit[3611]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3611 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.037000 audit[3611]: SYSCALL arch=c000003e syscall=46 
success=yes exit=228 a0=3 a1=7ffde2cb8e40 a2=0 a3=7ffde2cb8e2c items=0 ppid=3467 pid=3611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.037000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:04:01.041000 audit[3614]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3614 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:04:01.041000 audit[3614]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffe160a6b0 a2=0 a3=7fffe160a69c items=0 ppid=3467 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.041000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:04:01.044000 audit[3616]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3616 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:04:01.044000 audit[3616]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fffd9b40ac0 a2=0 a3=7fffd9b40aac items=0 ppid=3467 pid=3616 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.044000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:01.044000 audit[3616]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3616 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:04:01.044000 audit[3616]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fffd9b40ac0 a2=0 a3=7fffd9b40aac items=0 ppid=3467 pid=3616 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:01.044000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:02.433385 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount41858449.mount: Deactivated successfully. 
Jan 14 01:04:02.852619 containerd[1695]: time="2026-01-14T01:04:02.852567692Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:02.854499 containerd[1695]: time="2026-01-14T01:04:02.854353999Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 14 01:04:02.856173 containerd[1695]: time="2026-01-14T01:04:02.856152483Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:02.859644 containerd[1695]: time="2026-01-14T01:04:02.859622564Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:02.860129 containerd[1695]: time="2026-01-14T01:04:02.860106007Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.38794449s" Jan 14 01:04:02.860187 containerd[1695]: time="2026-01-14T01:04:02.860177248Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 14 01:04:02.864650 containerd[1695]: time="2026-01-14T01:04:02.864621449Z" level=info msg="CreateContainer within sandbox \"337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 14 01:04:02.877070 containerd[1695]: time="2026-01-14T01:04:02.877028662Z" level=info msg="Container 2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:02.884670 containerd[1695]: time="2026-01-14T01:04:02.884635172Z" level=info msg="CreateContainer within sandbox \"337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc\"" Jan 14 01:04:02.885357 containerd[1695]: time="2026-01-14T01:04:02.885172247Z" level=info msg="StartContainer for \"2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc\"" Jan 14 01:04:02.886037 containerd[1695]: time="2026-01-14T01:04:02.886012263Z" level=info msg="connecting to shim 2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc" address="unix:///run/containerd/s/d2f20eec08247b32b799b7380dd6c7427615b3320d2cba5f839f7f36d4c46bf4" protocol=ttrpc version=3 Jan 14 01:04:02.904329 systemd[1]: Started cri-containerd-2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc.scope - libcontainer container 2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc. 
Jan 14 01:04:02.912000 audit: BPF prog-id=150 op=LOAD Jan 14 01:04:02.912000 audit: BPF prog-id=151 op=LOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=151 op=UNLOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=152 op=LOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=153 op=LOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=153 op=UNLOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=152 op=UNLOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.912000 audit: BPF prog-id=154 op=LOAD Jan 14 01:04:02.912000 audit[3625]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=3373 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:02.912000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262323136343937343836663962643064366134396130616637613038 Jan 14 01:04:02.930476 containerd[1695]: time="2026-01-14T01:04:02.930430599Z" level=info msg="StartContainer for \"2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc\" returns successfully" Jan 14 01:04:03.920715 kubelet[3316]: I0114 01:04:03.920664 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-d4slc" podStartSLOduration=4.920647999 podStartE2EDuration="4.920647999s" podCreationTimestamp="2026-01-14 01:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:04:00.908091984 +0000 UTC m=+7.159755286" watchObservedRunningTime="2026-01-14 01:04:03.920647999 +0000 UTC m=+10.172311285" Jan 14 01:04:03.921557 kubelet[3316]: I0114 01:04:03.921165 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-rdwnf" podStartSLOduration=2.532223654 podStartE2EDuration="4.921155934s" podCreationTimestamp="2026-01-14 01:03:59 +0000 UTC" firstStartedPulling="2026-01-14 01:04:00.471824955 +0000 UTC m=+6.723488241" lastFinishedPulling="2026-01-14 01:04:02.860757234 +0000 UTC m=+9.112420521" observedRunningTime="2026-01-14 01:04:03.919891573 +0000 UTC m=+10.171554858" watchObservedRunningTime="2026-01-14 01:04:03.921155934 +0000 UTC m=+10.172819232" Jan 14 01:04:08.338135 sudo[2380]: pam_unix(sudo:session): session closed for user root Jan 14 01:04:08.342816 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 14 01:04:08.342916 kernel: audit: type=1106 audit(1768352648.337:533): pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:04:08.337000 audit[2380]: USER_END pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 01:04:08.337000 audit[2380]: CRED_DISP pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:04:08.348069 kernel: audit: type=1104 audit(1768352648.337:534): pid=2380 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:04:08.437144 sshd[2379]: Connection closed by 4.153.228.146 port 39540 Jan 14 01:04:08.439189 sshd-session[2375]: pam_unix(sshd:session): session closed for user core Jan 14 01:04:08.439000 audit[2375]: USER_END pid=2375 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:04:08.447065 kernel: audit: type=1106 audit(1768352648.439:535): pid=2375 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:04:08.445699 systemd[1]: sshd@6-10.0.21.32:22-4.153.228.146:39540.service: Deactivated successfully. Jan 14 01:04:08.447981 systemd[1]: session-8.scope: Deactivated successfully. Jan 14 01:04:08.448181 systemd[1]: session-8.scope: Consumed 4.686s CPU time, 232.3M memory peak. Jan 14 01:04:08.449655 systemd-logind[1667]: Session 8 logged out. Waiting for processes to exit. Jan 14 01:04:08.439000 audit[2375]: CRED_DISP pid=2375 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:04:08.455073 kernel: audit: type=1104 audit(1768352648.439:536): pid=2375 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:04:08.457001 systemd-logind[1667]: Removed session 8. Jan 14 01:04:08.444000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.32:22-4.153.228.146:39540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:04:08.462495 kernel: audit: type=1131 audit(1768352648.444:537): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.32:22-4.153.228.146:39540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:04:09.271000 audit[3703]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3703 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:09.276068 kernel: audit: type=1325 audit(1768352649.271:538): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3703 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:09.271000 audit[3703]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcd288f590 a2=0 a3=7ffcd288f57c items=0 ppid=3467 pid=3703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:09.283070 kernel: audit: type=1300 audit(1768352649.271:538): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcd288f590 a2=0 a3=7ffcd288f57c items=0 ppid=3467 pid=3703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:09.271000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:09.276000 audit[3703]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3703 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:09.286967 kernel: audit: type=1327 audit(1768352649.271:538): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:09.287016 kernel: audit: type=1325 audit(1768352649.276:539): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3703 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:09.276000 audit[3703]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcd288f590 a2=0 a3=0 items=0 ppid=3467 pid=3703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:09.295155 kernel: audit: type=1300 audit(1768352649.276:539): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcd288f590 a2=0 a3=0 items=0 ppid=3467 pid=3703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:09.276000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:10.327000 audit[3705]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3705 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:10.327000 audit[3705]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffd7cba0930 a2=0 a3=7ffd7cba091c items=0 ppid=3467 pid=3705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:10.327000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:10.342000 audit[3705]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3705 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:10.342000 
audit[3705]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd7cba0930 a2=0 a3=0 items=0 ppid=3467 pid=3705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:10.342000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:11.374000 audit[3707]: NETFILTER_CFG table=filter:109 family=2 entries=18 op=nft_register_rule pid=3707 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:11.374000 audit[3707]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcc4c427a0 a2=0 a3=7ffcc4c4278c items=0 ppid=3467 pid=3707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:11.374000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:11.379000 audit[3707]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3707 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:11.379000 audit[3707]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcc4c427a0 a2=0 a3=0 items=0 ppid=3467 pid=3707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:11.379000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:12.643000 audit[3709]: NETFILTER_CFG table=filter:111 family=2 entries=21 op=nft_register_rule pid=3709 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:12.643000 audit[3709]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff75fc2980 a2=0 a3=7fff75fc296c items=0 ppid=3467 pid=3709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:12.643000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:12.649000 audit[3709]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3709 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:12.649000 audit[3709]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff75fc2980 a2=0 a3=0 items=0 ppid=3467 pid=3709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:12.649000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:12.674727 systemd[1]: Created slice kubepods-besteffort-pod1f0dbfc0_64a7_487a_a734_2ea83d1c3b27.slice - libcontainer container kubepods-besteffort-pod1f0dbfc0_64a7_487a_a734_2ea83d1c3b27.slice. 
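The NETFILTER_CFG / SYSCALL / PROCTITLE triples above (pids 3703, 3705, 3707, 3709) are auditd's view of periodic iptables-restore runs, most likely driven by kube-proxy, whose pod started a few seconds earlier. arch=c000003e marks x86_64, where syscall 46 is sendmsg(2), the call the nft-based iptables-restore uses to push its ruleset over netlink. The PROCTITLE value is just the command line, hex-encoded with NUL separators between arguments; a minimal decoding sketch in Python, using the hex string copied from the records above:

    # Audit PROCTITLE values are the process argv, hex-encoded and NUL-separated.
    proctitle_hex = (
        "69707461626C65732D726573746F7265002D770035"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    )
    argv = bytes.fromhex(proctitle_hex).split(b"\x00")
    print([a.decode() for a in argv])
    # -> ['iptables-restore', '-w', '5', '--noflush', '--counters']

The much longer PROCTITLE strings in the runc audit records decode the same way, to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/..." with the container ID cut short by audit's proctitle length cap.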
Jan 14 01:04:12.744316 kubelet[3316]: I0114 01:04:12.744283 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1f0dbfc0-64a7-487a-a734-2ea83d1c3b27-typha-certs\") pod \"calico-typha-5d494f46bb-dwgld\" (UID: \"1f0dbfc0-64a7-487a-a734-2ea83d1c3b27\") " pod="calico-system/calico-typha-5d494f46bb-dwgld" Jan 14 01:04:12.744316 kubelet[3316]: I0114 01:04:12.744317 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f0dbfc0-64a7-487a-a734-2ea83d1c3b27-tigera-ca-bundle\") pod \"calico-typha-5d494f46bb-dwgld\" (UID: \"1f0dbfc0-64a7-487a-a734-2ea83d1c3b27\") " pod="calico-system/calico-typha-5d494f46bb-dwgld" Jan 14 01:04:12.745072 kubelet[3316]: I0114 01:04:12.744333 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxt5\" (UniqueName: \"kubernetes.io/projected/1f0dbfc0-64a7-487a-a734-2ea83d1c3b27-kube-api-access-nhxt5\") pod \"calico-typha-5d494f46bb-dwgld\" (UID: \"1f0dbfc0-64a7-487a-a734-2ea83d1c3b27\") " pod="calico-system/calico-typha-5d494f46bb-dwgld" Jan 14 01:04:12.865396 systemd[1]: Created slice kubepods-besteffort-pod53c39008_c4eb_4eaf_983f_cc6c5c3edb37.slice - libcontainer container kubepods-besteffort-pod53c39008_c4eb_4eaf_983f_cc6c5c3edb37.slice. Jan 14 01:04:12.945848 kubelet[3316]: I0114 01:04:12.944984 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-cni-net-dir\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.945848 kubelet[3316]: I0114 01:04:12.945029 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-policysync\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.945848 kubelet[3316]: I0114 01:04:12.945044 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-tigera-ca-bundle\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.945848 kubelet[3316]: I0114 01:04:12.945555 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-var-run-calico\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.945848 kubelet[3316]: I0114 01:04:12.945587 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-node-certs\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946109 kubelet[3316]: I0114 01:04:12.945603 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" 
(UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-flexvol-driver-host\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946109 kubelet[3316]: I0114 01:04:12.945622 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t8lc\" (UniqueName: \"kubernetes.io/projected/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-kube-api-access-6t8lc\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946109 kubelet[3316]: I0114 01:04:12.945639 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-cni-log-dir\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946109 kubelet[3316]: I0114 01:04:12.945666 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-var-lib-calico\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946109 kubelet[3316]: I0114 01:04:12.945684 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-cni-bin-dir\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946232 kubelet[3316]: I0114 01:04:12.945705 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-lib-modules\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.946232 kubelet[3316]: I0114 01:04:12.945724 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/53c39008-c4eb-4eaf-983f-cc6c5c3edb37-xtables-lock\") pod \"calico-node-mw7w8\" (UID: \"53c39008-c4eb-4eaf-983f-cc6c5c3edb37\") " pod="calico-system/calico-node-mw7w8" Jan 14 01:04:12.987571 containerd[1695]: time="2026-01-14T01:04:12.987517834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d494f46bb-dwgld,Uid:1f0dbfc0-64a7-487a-a734-2ea83d1c3b27,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:13.021571 containerd[1695]: time="2026-01-14T01:04:13.020657529Z" level=info msg="connecting to shim 8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab" address="unix:///run/containerd/s/90822588ff8813df799bdfbae2980ac32117aabe5c4cadd9c620526a1008cfb4" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:13.054016 kubelet[3316]: E0114 01:04:13.053568 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.054016 kubelet[3316]: W0114 01:04:13.053871 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 
01:04:13.054016 kubelet[3316]: E0114 01:04:13.053989 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.054425 kubelet[3316]: E0114 01:04:13.054403 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.054425 kubelet[3316]: W0114 01:04:13.054416 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.054527 kubelet[3316]: E0114 01:04:13.054516 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.055284 kubelet[3316]: E0114 01:04:13.054810 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.055284 kubelet[3316]: W0114 01:04:13.054817 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.055284 kubelet[3316]: E0114 01:04:13.054826 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.056240 kubelet[3316]: E0114 01:04:13.055758 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.056240 kubelet[3316]: W0114 01:04:13.055771 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.056240 kubelet[3316]: E0114 01:04:13.055783 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.058496 kubelet[3316]: E0114 01:04:13.056819 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.058496 kubelet[3316]: W0114 01:04:13.056828 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.058496 kubelet[3316]: E0114 01:04:13.056839 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.061022 kubelet[3316]: E0114 01:04:13.061007 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.061022 kubelet[3316]: W0114 01:04:13.061020 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.061115 kubelet[3316]: E0114 01:04:13.061041 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.061340 kubelet[3316]: E0114 01:04:13.061330 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.061340 kubelet[3316]: W0114 01:04:13.061339 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.061401 kubelet[3316]: E0114 01:04:13.061348 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.063292 kubelet[3316]: E0114 01:04:13.063276 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.063292 kubelet[3316]: W0114 01:04:13.063288 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.063370 kubelet[3316]: E0114 01:04:13.063324 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.063546 kubelet[3316]: E0114 01:04:13.063534 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.063546 kubelet[3316]: W0114 01:04:13.063544 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.063597 kubelet[3316]: E0114 01:04:13.063552 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.064900 kubelet[3316]: E0114 01:04:13.064884 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.064900 kubelet[3316]: W0114 01:04:13.064895 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.064967 kubelet[3316]: E0114 01:04:13.064904 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.065393 kubelet[3316]: E0114 01:04:13.065339 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.065393 kubelet[3316]: W0114 01:04:13.065351 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.065576 kubelet[3316]: E0114 01:04:13.065544 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.067877 kubelet[3316]: E0114 01:04:13.067401 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.067877 kubelet[3316]: W0114 01:04:13.067414 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.067877 kubelet[3316]: E0114 01:04:13.067425 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.068361 kubelet[3316]: E0114 01:04:13.068344 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.068361 kubelet[3316]: W0114 01:04:13.068357 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.068497 kubelet[3316]: E0114 01:04:13.068367 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.070464 kubelet[3316]: E0114 01:04:13.070282 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.070586 kubelet[3316]: W0114 01:04:13.070570 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.070615 kubelet[3316]: E0114 01:04:13.070588 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071169 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.073103 kubelet[3316]: W0114 01:04:13.071181 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071190 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071603 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.073103 kubelet[3316]: W0114 01:04:13.071610 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071618 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071906 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.073103 kubelet[3316]: W0114 01:04:13.071912 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.071919 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.073103 kubelet[3316]: E0114 01:04:13.072751 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.073336 kubelet[3316]: W0114 01:04:13.072760 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.073336 kubelet[3316]: E0114 01:04:13.072770 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.073336 kubelet[3316]: E0114 01:04:13.073294 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.073336 kubelet[3316]: W0114 01:04:13.073303 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.073336 kubelet[3316]: E0114 01:04:13.073313 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.074028 kubelet[3316]: E0114 01:04:13.074013 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.074028 kubelet[3316]: W0114 01:04:13.074024 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.074115 kubelet[3316]: E0114 01:04:13.074035 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.075535 kubelet[3316]: E0114 01:04:13.075506 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.075776 kubelet[3316]: W0114 01:04:13.075718 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.075776 kubelet[3316]: E0114 01:04:13.075734 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.077063 kubelet[3316]: E0114 01:04:13.076963 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.077063 kubelet[3316]: W0114 01:04:13.076974 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.077063 kubelet[3316]: E0114 01:04:13.076984 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.077943 kubelet[3316]: E0114 01:04:13.077889 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.077943 kubelet[3316]: W0114 01:04:13.077901 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.077943 kubelet[3316]: E0114 01:04:13.077911 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.078236 systemd[1]: Started cri-containerd-8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab.scope - libcontainer container 8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab. 
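The repeated driver-call.go and plugins.go errors above and below are kubelet's FlexVolume plugin probe: on every rescan of the plugin directory it executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and tries to unmarshal stdout as JSON; because the binary is not installed yet, the exec fails, stdout is empty, and decoding fails with "unexpected end of JSON input". The calico-node-mw7w8 pod declared above mounts exactly this directory as flexvol-driver-host, and Calico normally installs the uds driver there, so the probe would be expected to start succeeding once that pod is running. A rough sketch of the probe, assuming only the standard FlexVolume init convention (a JSON status object on stdout):

    import json, subprocess

    # Path kubelet probes, taken from the driver-call.go messages above.
    DRIVER = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

    try:
        out = subprocess.run([DRIVER, "init"], capture_output=True, text=True).stdout
        print(json.loads(out))
    except FileNotFoundError as err:
        print("driver call failed:", err)         # binary missing, as in the log above
    except json.JSONDecodeError as err:
        print("empty or malformed output:", err)  # what kubelet reports as "unexpected end of JSON input"

    # A present driver conventionally answers "init" with something like:
    #   {"status": "Success", "capabilities": {"attach": false}}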
Jan 14 01:04:13.080600 kubelet[3316]: E0114 01:04:13.080573 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:13.095000 audit: BPF prog-id=155 op=LOAD Jan 14 01:04:13.096000 audit: BPF prog-id=156 op=LOAD Jan 14 01:04:13.096000 audit[3731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.096000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=156 op=UNLOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=157 op=LOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=158 op=LOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=158 op=UNLOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=157 op=UNLOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.097000 audit: BPF prog-id=159 op=LOAD Jan 14 01:04:13.097000 audit[3731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3720 pid=3731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866386238353230623739356536303732663234303232616139343231 Jan 14 01:04:13.140769 containerd[1695]: time="2026-01-14T01:04:13.140722932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d494f46bb-dwgld,Uid:1f0dbfc0-64a7-487a-a734-2ea83d1c3b27,Namespace:calico-system,Attempt:0,} returns sandbox id \"8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab\"" Jan 14 01:04:13.141399 kubelet[3316]: E0114 01:04:13.141307 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.141399 kubelet[3316]: W0114 01:04:13.141322 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.141399 kubelet[3316]: E0114 01:04:13.141340 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.141675 kubelet[3316]: E0114 01:04:13.141611 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.141675 kubelet[3316]: W0114 01:04:13.141618 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.141675 kubelet[3316]: E0114 01:04:13.141627 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.141807 kubelet[3316]: E0114 01:04:13.141801 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.141850 kubelet[3316]: W0114 01:04:13.141844 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.141916 kubelet[3316]: E0114 01:04:13.141900 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.142365 containerd[1695]: time="2026-01-14T01:04:13.142329908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 14 01:04:13.142520 kubelet[3316]: E0114 01:04:13.142511 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.142571 kubelet[3316]: W0114 01:04:13.142559 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.142722 kubelet[3316]: E0114 01:04:13.142651 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.142900 kubelet[3316]: E0114 01:04:13.142892 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.143032 kubelet[3316]: W0114 01:04:13.142942 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.143032 kubelet[3316]: E0114 01:04:13.142969 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.143168 kubelet[3316]: E0114 01:04:13.143153 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.143203 kubelet[3316]: W0114 01:04:13.143197 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.143256 kubelet[3316]: E0114 01:04:13.143250 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.143410 kubelet[3316]: E0114 01:04:13.143404 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.143467 kubelet[3316]: W0114 01:04:13.143443 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.143467 kubelet[3316]: E0114 01:04:13.143451 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.143726 kubelet[3316]: E0114 01:04:13.143677 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.143726 kubelet[3316]: W0114 01:04:13.143685 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.143726 kubelet[3316]: E0114 01:04:13.143692 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.144067 kubelet[3316]: E0114 01:04:13.144018 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.144067 kubelet[3316]: W0114 01:04:13.144026 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.144067 kubelet[3316]: E0114 01:04:13.144033 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.144350 kubelet[3316]: E0114 01:04:13.144299 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.144481 kubelet[3316]: W0114 01:04:13.144400 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.144481 kubelet[3316]: E0114 01:04:13.144445 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.144903 kubelet[3316]: E0114 01:04:13.144892 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.145021 kubelet[3316]: W0114 01:04:13.144952 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.145021 kubelet[3316]: E0114 01:04:13.144964 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.145329 kubelet[3316]: E0114 01:04:13.145320 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.146000 kubelet[3316]: W0114 01:04:13.145877 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.146000 kubelet[3316]: E0114 01:04:13.145894 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.146380 kubelet[3316]: E0114 01:04:13.146311 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.146380 kubelet[3316]: W0114 01:04:13.146319 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.146380 kubelet[3316]: E0114 01:04:13.146327 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.146603 kubelet[3316]: E0114 01:04:13.146541 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.146603 kubelet[3316]: W0114 01:04:13.146551 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.146603 kubelet[3316]: E0114 01:04:13.146558 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.146773 kubelet[3316]: E0114 01:04:13.146755 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.146851 kubelet[3316]: W0114 01:04:13.146805 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.146851 kubelet[3316]: E0114 01:04:13.146814 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.147002 kubelet[3316]: E0114 01:04:13.146988 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.147122 kubelet[3316]: W0114 01:04:13.147038 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.147122 kubelet[3316]: E0114 01:04:13.147062 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.147343 kubelet[3316]: E0114 01:04:13.147208 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.147343 kubelet[3316]: W0114 01:04:13.147214 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.147343 kubelet[3316]: E0114 01:04:13.147223 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.149078 kubelet[3316]: E0114 01:04:13.147532 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.149078 kubelet[3316]: W0114 01:04:13.147539 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.149078 kubelet[3316]: E0114 01:04:13.147546 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.149348 kubelet[3316]: E0114 01:04:13.149307 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.149348 kubelet[3316]: W0114 01:04:13.149317 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.149348 kubelet[3316]: E0114 01:04:13.149328 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.149636 kubelet[3316]: E0114 01:04:13.149627 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.149702 kubelet[3316]: W0114 01:04:13.149694 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.149885 kubelet[3316]: E0114 01:04:13.149734 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.150031 kubelet[3316]: E0114 01:04:13.150023 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.150184 kubelet[3316]: W0114 01:04:13.150085 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.150184 kubelet[3316]: E0114 01:04:13.150095 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.150184 kubelet[3316]: I0114 01:04:13.150117 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4395ad87-346f-47f3-8e06-f63944f13a5d-varrun\") pod \"csi-node-driver-gdh9l\" (UID: \"4395ad87-346f-47f3-8e06-f63944f13a5d\") " pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:13.150360 kubelet[3316]: E0114 01:04:13.150335 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.150360 kubelet[3316]: W0114 01:04:13.150344 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.150426 kubelet[3316]: E0114 01:04:13.150420 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.150470 kubelet[3316]: I0114 01:04:13.150463 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnksh\" (UniqueName: \"kubernetes.io/projected/4395ad87-346f-47f3-8e06-f63944f13a5d-kube-api-access-bnksh\") pod \"csi-node-driver-gdh9l\" (UID: \"4395ad87-346f-47f3-8e06-f63944f13a5d\") " pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:13.150740 kubelet[3316]: E0114 01:04:13.150725 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.150776 kubelet[3316]: W0114 01:04:13.150740 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.150776 kubelet[3316]: E0114 01:04:13.150754 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.150977 kubelet[3316]: E0114 01:04:13.150966 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.151077 kubelet[3316]: W0114 01:04:13.151065 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.151104 kubelet[3316]: E0114 01:04:13.151081 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.151555 kubelet[3316]: E0114 01:04:13.151542 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.151555 kubelet[3316]: W0114 01:04:13.151553 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.151618 kubelet[3316]: E0114 01:04:13.151562 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.151618 kubelet[3316]: I0114 01:04:13.151582 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4395ad87-346f-47f3-8e06-f63944f13a5d-registration-dir\") pod \"csi-node-driver-gdh9l\" (UID: \"4395ad87-346f-47f3-8e06-f63944f13a5d\") " pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:13.151816 kubelet[3316]: E0114 01:04:13.151805 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.151816 kubelet[3316]: W0114 01:04:13.151815 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.151871 kubelet[3316]: E0114 01:04:13.151823 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.151994 kubelet[3316]: I0114 01:04:13.151982 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4395ad87-346f-47f3-8e06-f63944f13a5d-socket-dir\") pod \"csi-node-driver-gdh9l\" (UID: \"4395ad87-346f-47f3-8e06-f63944f13a5d\") " pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:13.152298 kubelet[3316]: E0114 01:04:13.152281 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.152298 kubelet[3316]: W0114 01:04:13.152294 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.152364 kubelet[3316]: E0114 01:04:13.152303 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.152456 kubelet[3316]: E0114 01:04:13.152447 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.152480 kubelet[3316]: W0114 01:04:13.152456 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.152480 kubelet[3316]: E0114 01:04:13.152463 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.152784 kubelet[3316]: E0114 01:04:13.152729 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.152784 kubelet[3316]: W0114 01:04:13.152740 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.152784 kubelet[3316]: E0114 01:04:13.152749 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.152784 kubelet[3316]: I0114 01:04:13.152770 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4395ad87-346f-47f3-8e06-f63944f13a5d-kubelet-dir\") pod \"csi-node-driver-gdh9l\" (UID: \"4395ad87-346f-47f3-8e06-f63944f13a5d\") " pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:13.153623 kubelet[3316]: E0114 01:04:13.153602 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.153623 kubelet[3316]: W0114 01:04:13.153618 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.153695 kubelet[3316]: E0114 01:04:13.153628 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.154121 kubelet[3316]: E0114 01:04:13.154108 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.154121 kubelet[3316]: W0114 01:04:13.154118 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.154183 kubelet[3316]: E0114 01:04:13.154127 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.154286 kubelet[3316]: E0114 01:04:13.154275 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.154286 kubelet[3316]: W0114 01:04:13.154284 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.154329 kubelet[3316]: E0114 01:04:13.154291 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.154616 kubelet[3316]: E0114 01:04:13.154604 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.154616 kubelet[3316]: W0114 01:04:13.154614 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.154677 kubelet[3316]: E0114 01:04:13.154621 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.155201 kubelet[3316]: E0114 01:04:13.155189 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.155201 kubelet[3316]: W0114 01:04:13.155200 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.155259 kubelet[3316]: E0114 01:04:13.155208 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.155356 kubelet[3316]: E0114 01:04:13.155347 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.155356 kubelet[3316]: W0114 01:04:13.155354 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.155426 kubelet[3316]: E0114 01:04:13.155361 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.177391 containerd[1695]: time="2026-01-14T01:04:13.177351776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mw7w8,Uid:53c39008-c4eb-4eaf-983f-cc6c5c3edb37,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:13.205250 containerd[1695]: time="2026-01-14T01:04:13.205138122Z" level=info msg="connecting to shim b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431" address="unix:///run/containerd/s/c4d74ac0321e631164279ed19b87f243676783e48bbefab59d82e4fcba016a17" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:13.229253 systemd[1]: Started cri-containerd-b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431.scope - libcontainer container b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431. 
Jan 14 01:04:13.236000 audit: BPF prog-id=160 op=LOAD Jan 14 01:04:13.237000 audit: BPF prog-id=161 op=LOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=161 op=UNLOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=162 op=LOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=163 op=LOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=163 op=UNLOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=162 op=UNLOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.237000 audit: BPF prog-id=164 op=LOAD Jan 14 01:04:13.237000 audit[3851]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3840 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.237000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230386364373462376364316535653961663533663838643434666466 Jan 14 01:04:13.255056 kubelet[3316]: E0114 01:04:13.255024 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.255336 containerd[1695]: time="2026-01-14T01:04:13.255305380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mw7w8,Uid:53c39008-c4eb-4eaf-983f-cc6c5c3edb37,Namespace:calico-system,Attempt:0,} returns sandbox id \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\"" Jan 14 01:04:13.255544 kubelet[3316]: W0114 01:04:13.255040 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.255544 kubelet[3316]: E0114 01:04:13.255405 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.255652 kubelet[3316]: E0114 01:04:13.255579 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.255652 kubelet[3316]: W0114 01:04:13.255585 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.255652 kubelet[3316]: E0114 01:04:13.255593 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255720 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256156 kubelet[3316]: W0114 01:04:13.255728 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255735 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255864 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256156 kubelet[3316]: W0114 01:04:13.255870 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255876 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255981 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256156 kubelet[3316]: W0114 01:04:13.255986 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256156 kubelet[3316]: E0114 01:04:13.255991 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256368 kubelet[3316]: E0114 01:04:13.256169 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256368 kubelet[3316]: W0114 01:04:13.256175 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256368 kubelet[3316]: E0114 01:04:13.256181 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256368 kubelet[3316]: E0114 01:04:13.256304 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256368 kubelet[3316]: W0114 01:04:13.256309 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256368 kubelet[3316]: E0114 01:04:13.256316 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.256483 kubelet[3316]: E0114 01:04:13.256426 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256483 kubelet[3316]: W0114 01:04:13.256431 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256483 kubelet[3316]: E0114 01:04:13.256436 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256736 kubelet[3316]: E0114 01:04:13.256725 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256736 kubelet[3316]: W0114 01:04:13.256734 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256807 kubelet[3316]: E0114 01:04:13.256741 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.256896 kubelet[3316]: E0114 01:04:13.256887 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.256896 kubelet[3316]: W0114 01:04:13.256895 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.256937 kubelet[3316]: E0114 01:04:13.256901 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.257022 kubelet[3316]: E0114 01:04:13.257005 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.257022 kubelet[3316]: W0114 01:04:13.257021 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.257076 kubelet[3316]: E0114 01:04:13.257028 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.257155 kubelet[3316]: E0114 01:04:13.257147 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.257155 kubelet[3316]: W0114 01:04:13.257155 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.257198 kubelet[3316]: E0114 01:04:13.257161 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.257456 kubelet[3316]: E0114 01:04:13.257280 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.257456 kubelet[3316]: W0114 01:04:13.257289 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.257456 kubelet[3316]: E0114 01:04:13.257294 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.257659 kubelet[3316]: E0114 01:04:13.257461 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.257659 kubelet[3316]: W0114 01:04:13.257467 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.257659 kubelet[3316]: E0114 01:04:13.257473 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.257659 kubelet[3316]: E0114 01:04:13.257583 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.257659 kubelet[3316]: W0114 01:04:13.257599 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.257659 kubelet[3316]: E0114 01:04:13.257605 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.258224 kubelet[3316]: E0114 01:04:13.257715 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.258224 kubelet[3316]: W0114 01:04:13.257720 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.258224 kubelet[3316]: E0114 01:04:13.257726 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.258224 kubelet[3316]: E0114 01:04:13.257856 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.258224 kubelet[3316]: W0114 01:04:13.257861 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.258224 kubelet[3316]: E0114 01:04:13.257867 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.258224 kubelet[3316]: E0114 01:04:13.258221 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.258224 kubelet[3316]: W0114 01:04:13.258227 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.258463 kubelet[3316]: E0114 01:04:13.258235 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.258463 kubelet[3316]: E0114 01:04:13.258356 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.258463 kubelet[3316]: W0114 01:04:13.258361 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.258463 kubelet[3316]: E0114 01:04:13.258367 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258494 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259486 kubelet[3316]: W0114 01:04:13.258499 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258505 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258608 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259486 kubelet[3316]: W0114 01:04:13.258612 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258617 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258709 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259486 kubelet[3316]: W0114 01:04:13.258713 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258718 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.259486 kubelet[3316]: E0114 01:04:13.258835 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259667 kubelet[3316]: W0114 01:04:13.258848 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259667 kubelet[3316]: E0114 01:04:13.258853 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.259667 kubelet[3316]: E0114 01:04:13.258956 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259667 kubelet[3316]: W0114 01:04:13.258960 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259667 kubelet[3316]: E0114 01:04:13.258965 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.259667 kubelet[3316]: E0114 01:04:13.259083 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.259667 kubelet[3316]: W0114 01:04:13.259087 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.259667 kubelet[3316]: E0114 01:04:13.259092 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:13.269753 kubelet[3316]: E0114 01:04:13.269706 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:13.269753 kubelet[3316]: W0114 01:04:13.269720 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:13.269753 kubelet[3316]: E0114 01:04:13.269732 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:13.668000 audit[3905]: NETFILTER_CFG table=filter:113 family=2 entries=22 op=nft_register_rule pid=3905 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:13.670456 kernel: kauditd_printk_skb: 63 callbacks suppressed Jan 14 01:04:13.670500 kernel: audit: type=1325 audit(1768352653.668:562): table=filter:113 family=2 entries=22 op=nft_register_rule pid=3905 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:13.668000 audit[3905]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe4ac66c00 a2=0 a3=7ffe4ac66bec items=0 ppid=3467 pid=3905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.680073 kernel: audit: type=1300 audit(1768352653.668:562): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe4ac66c00 a2=0 a3=7ffe4ac66bec items=0 ppid=3467 pid=3905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.668000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:13.674000 audit[3905]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3905 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:13.683938 kernel: audit: type=1327 audit(1768352653.668:562): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:13.683970 kernel: audit: type=1325 audit(1768352653.674:563): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3905 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:13.674000 audit[3905]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe4ac66c00 a2=0 a3=0 items=0 ppid=3467 pid=3905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.686946 kernel: audit: type=1300 audit(1768352653.674:563): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe4ac66c00 a2=0 a3=0 items=0 ppid=3467 pid=3905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:13.674000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:13.691114 kernel: audit: type=1327 audit(1768352653.674:563): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:14.610985 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1114671585.mount: Deactivated successfully. 
Jan 14 01:04:14.855706 kubelet[3316]: E0114 01:04:14.855643 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:15.544886 containerd[1695]: time="2026-01-14T01:04:15.544843490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:15.546399 containerd[1695]: time="2026-01-14T01:04:15.546358913Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 14 01:04:15.547812 containerd[1695]: time="2026-01-14T01:04:15.547768013Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:15.549934 containerd[1695]: time="2026-01-14T01:04:15.549898912Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:15.550450 containerd[1695]: time="2026-01-14T01:04:15.550374214Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.407940469s" Jan 14 01:04:15.550450 containerd[1695]: time="2026-01-14T01:04:15.550395125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 14 01:04:15.551688 containerd[1695]: time="2026-01-14T01:04:15.551665968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 14 01:04:15.566239 containerd[1695]: time="2026-01-14T01:04:15.565562628Z" level=info msg="CreateContainer within sandbox \"8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 14 01:04:15.577780 containerd[1695]: time="2026-01-14T01:04:15.577342810Z" level=info msg="Container 02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:15.580943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount860583976.mount: Deactivated successfully. 
Jan 14 01:04:15.589776 containerd[1695]: time="2026-01-14T01:04:15.589723310Z" level=info msg="CreateContainer within sandbox \"8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f\"" Jan 14 01:04:15.590368 containerd[1695]: time="2026-01-14T01:04:15.590351289Z" level=info msg="StartContainer for \"02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f\"" Jan 14 01:04:15.592637 containerd[1695]: time="2026-01-14T01:04:15.592561572Z" level=info msg="connecting to shim 02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f" address="unix:///run/containerd/s/90822588ff8813df799bdfbae2980ac32117aabe5c4cadd9c620526a1008cfb4" protocol=ttrpc version=3 Jan 14 01:04:15.613268 systemd[1]: Started cri-containerd-02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f.scope - libcontainer container 02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f. Jan 14 01:04:15.624000 audit: BPF prog-id=165 op=LOAD Jan 14 01:04:15.626000 audit: BPF prog-id=166 op=LOAD Jan 14 01:04:15.628250 kernel: audit: type=1334 audit(1768352655.624:564): prog-id=165 op=LOAD Jan 14 01:04:15.628296 kernel: audit: type=1334 audit(1768352655.626:565): prog-id=166 op=LOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.630781 kernel: audit: type=1300 audit(1768352655.626:565): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.634794 kernel: audit: type=1327 audit(1768352655.626:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=166 op=UNLOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=167 op=LOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3720 pid=3916 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=168 op=LOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=168 op=UNLOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=167 op=UNLOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.626000 audit: BPF prog-id=169 op=LOAD Jan 14 01:04:15.626000 audit[3916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3720 pid=3916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:15.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032636633346661643032373165386230656430323638333639383362 Jan 14 01:04:15.674108 containerd[1695]: time="2026-01-14T01:04:15.674076607Z" level=info msg="StartContainer for \"02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f\" returns successfully" Jan 14 01:04:15.961578 kubelet[3316]: I0114 01:04:15.961533 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="calico-system/calico-typha-5d494f46bb-dwgld" podStartSLOduration=1.552434646 podStartE2EDuration="3.961521547s" podCreationTimestamp="2026-01-14 01:04:12 +0000 UTC" firstStartedPulling="2026-01-14 01:04:13.142076014 +0000 UTC m=+19.393739299" lastFinishedPulling="2026-01-14 01:04:15.551162914 +0000 UTC m=+21.802826200" observedRunningTime="2026-01-14 01:04:15.961091308 +0000 UTC m=+22.212754611" watchObservedRunningTime="2026-01-14 01:04:15.961521547 +0000 UTC m=+22.213184856" Jan 14 01:04:15.965751 kubelet[3316]: E0114 01:04:15.965698 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.965751 kubelet[3316]: W0114 01:04:15.965716 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.965751 kubelet[3316]: E0114 01:04:15.965733 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.966043 kubelet[3316]: E0114 01:04:15.966035 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.967190 kubelet[3316]: W0114 01:04:15.967087 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.967190 kubelet[3316]: E0114 01:04:15.967102 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.967464 kubelet[3316]: E0114 01:04:15.967358 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.967464 kubelet[3316]: W0114 01:04:15.967366 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.967464 kubelet[3316]: E0114 01:04:15.967374 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.967569 kubelet[3316]: E0114 01:04:15.967563 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.967600 kubelet[3316]: W0114 01:04:15.967594 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.967710 kubelet[3316]: E0114 01:04:15.967622 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:15.967768 kubelet[3316]: E0114 01:04:15.967763 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.967801 kubelet[3316]: W0114 01:04:15.967797 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.967887 kubelet[3316]: E0114 01:04:15.967819 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.967942 kubelet[3316]: E0114 01:04:15.967937 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.968063 kubelet[3316]: W0114 01:04:15.967977 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.968063 kubelet[3316]: E0114 01:04:15.967993 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.968149 kubelet[3316]: E0114 01:04:15.968143 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.968194 kubelet[3316]: W0114 01:04:15.968176 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.968194 kubelet[3316]: E0114 01:04:15.968184 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.968426 kubelet[3316]: E0114 01:04:15.968349 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.968426 kubelet[3316]: W0114 01:04:15.968355 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.968426 kubelet[3316]: E0114 01:04:15.968362 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.968535 kubelet[3316]: E0114 01:04:15.968529 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.968566 kubelet[3316]: W0114 01:04:15.968561 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.968606 kubelet[3316]: E0114 01:04:15.968601 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:15.968812 kubelet[3316]: E0114 01:04:15.968752 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.968812 kubelet[3316]: W0114 01:04:15.968757 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.968812 kubelet[3316]: E0114 01:04:15.968763 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.968989 kubelet[3316]: E0114 01:04:15.968983 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.969023 kubelet[3316]: W0114 01:04:15.969018 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.969145 kubelet[3316]: E0114 01:04:15.969075 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.970114 kubelet[3316]: E0114 01:04:15.970102 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.970191 kubelet[3316]: W0114 01:04:15.970183 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.970308 kubelet[3316]: E0114 01:04:15.970236 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.970438 kubelet[3316]: E0114 01:04:15.970432 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.970485 kubelet[3316]: W0114 01:04:15.970470 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.970573 kubelet[3316]: E0114 01:04:15.970478 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:04:15.970638 kubelet[3316]: E0114 01:04:15.970633 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.970677 kubelet[3316]: W0114 01:04:15.970667 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.970768 kubelet[3316]: E0114 01:04:15.970708 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:15.970831 kubelet[3316]: E0114 01:04:15.970825 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:15.970873 kubelet[3316]: W0114 01:04:15.970857 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:15.970950 kubelet[3316]: E0114 01:04:15.970864 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[The same three kubelet messages (driver-call.go:262, driver-call.go:149, plugins.go:697) repeat with only their timestamps changing through Jan 14 01:04:15.977, as kubelet keeps re-probing the nodeagent~uds FlexVolume plugin directory.]
Jan 14 01:04:16.855299 kubelet[3316]: E0114 01:04:16.855168 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:16.936973 kubelet[3316]: I0114 01:04:16.936887 3316 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
[The FlexVolume probe error triplet then resumes at Jan 14 01:04:16.975 and repeats, again with only the timestamps changing, up to the final occurrence kept below.]
Jan 14 01:04:16.983806 kubelet[3316]: E0114 01:04:16.983799 3316 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:04:16.983806 kubelet[3316]: W0114 01:04:16.983806 3316 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:04:16.983844 kubelet[3316]: E0114 01:04:16.983811 3316 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:04:17.077790 containerd[1695]: time="2026-01-14T01:04:17.077294624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:17.079638 containerd[1695]: time="2026-01-14T01:04:17.079617543Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:17.082094 containerd[1695]: time="2026-01-14T01:04:17.081698248Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:17.083142 containerd[1695]: time="2026-01-14T01:04:17.083100681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:17.083730 containerd[1695]: time="2026-01-14T01:04:17.083559465Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.531849625s" Jan 14 01:04:17.083730 containerd[1695]: time="2026-01-14T01:04:17.083584263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 14 01:04:17.088427 containerd[1695]: time="2026-01-14T01:04:17.088405424Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 14 01:04:17.100064 containerd[1695]: time="2026-01-14T01:04:17.100030329Z" level=info msg="Container 13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:17.109152 containerd[1695]: time="2026-01-14T01:04:17.109082857Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321\"" Jan 14 01:04:17.109985 containerd[1695]: time="2026-01-14T01:04:17.109950898Z" level=info msg="StartContainer for \"13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321\"" Jan 14 01:04:17.111570 containerd[1695]: time="2026-01-14T01:04:17.111547920Z" level=info msg="connecting to shim 13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321" address="unix:///run/containerd/s/c4d74ac0321e631164279ed19b87f243676783e48bbefab59d82e4fcba016a17" protocol=ttrpc version=3 Jan 14 01:04:17.138233 systemd[1]: Started cri-containerd-13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321.scope - libcontainer container 13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321. 
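The repeated kubelet errors above and the flexvol-driver container started here are two sides of the same event: kubelet probes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds before Calico's pod2daemon-flexvol image has installed that binary, the exec fails, and decoding the resulting empty stdout as JSON is exactly what yields "unexpected end of JSON input". A minimal Go sketch of that failure mode (the DriverStatus shape and the success payload are illustrative assumptions, not taken from this log):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus is an assumed, minimal shape of the JSON a FlexVolume driver
// prints on stdout in response to "init" (illustrative only).
type DriverStatus struct {
	Status       string `json:"status"`
	Capabilities struct {
		Attach bool `json:"attach"`
	} `json:"capabilities"`
}

func main() {
	var st DriverStatus

	// The driver binary was not found, so kubelet captured empty output.
	// Unmarshalling "" produces the exact error seen in the log above.
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // unexpected end of JSON input

	// Once a driver binary is in place and answers "init" with a small
	// status object, the same decode succeeds (payload assumed here).
	ok := []byte(`{"status":"Success","capabilities":{"attach":false}}`)
	if err := json.Unmarshal(ok, &st); err == nil {
		fmt.Println(st.Status) // Success
	}
}
```

Which is why the probe errors would be expected to stop once the flexvol-driver container created above has run and installed the uds binary.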
Jan 14 01:04:17.184000 audit: BPF prog-id=170 op=LOAD Jan 14 01:04:17.184000 audit[4026]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3840 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:17.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133616532323866633034653262313831616131643138613935383932 Jan 14 01:04:17.184000 audit: BPF prog-id=171 op=LOAD Jan 14 01:04:17.184000 audit[4026]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3840 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:17.184000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133616532323866633034653262313831616131643138613935383932 Jan 14 01:04:17.185000 audit: BPF prog-id=171 op=UNLOAD Jan 14 01:04:17.185000 audit[4026]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:17.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133616532323866633034653262313831616131643138613935383932 Jan 14 01:04:17.185000 audit: BPF prog-id=170 op=UNLOAD Jan 14 01:04:17.185000 audit[4026]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:17.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133616532323866633034653262313831616131643138613935383932 Jan 14 01:04:17.185000 audit: BPF prog-id=172 op=LOAD Jan 14 01:04:17.185000 audit[4026]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3840 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:17.185000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133616532323866633034653262313831616131643138613935383932 Jan 14 01:04:17.205672 containerd[1695]: time="2026-01-14T01:04:17.205633949Z" level=info msg="StartContainer for 
\"13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321\" returns successfully" Jan 14 01:04:17.217797 systemd[1]: cri-containerd-13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321.scope: Deactivated successfully. Jan 14 01:04:17.220668 containerd[1695]: time="2026-01-14T01:04:17.220628349Z" level=info msg="received container exit event container_id:\"13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321\" id:\"13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321\" pid:4038 exited_at:{seconds:1768352657 nanos:220307674}" Jan 14 01:04:17.222000 audit: BPF prog-id=172 op=UNLOAD Jan 14 01:04:17.250893 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321-rootfs.mount: Deactivated successfully. Jan 14 01:04:18.855100 kubelet[3316]: E0114 01:04:18.855021 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:18.944902 containerd[1695]: time="2026-01-14T01:04:18.944600765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 14 01:04:20.855272 kubelet[3316]: E0114 01:04:20.855209 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:22.736088 containerd[1695]: time="2026-01-14T01:04:22.735908000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:22.738334 containerd[1695]: time="2026-01-14T01:04:22.738304240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 14 01:04:22.740144 containerd[1695]: time="2026-01-14T01:04:22.740107210Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:22.744206 containerd[1695]: time="2026-01-14T01:04:22.744170938Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:22.744999 containerd[1695]: time="2026-01-14T01:04:22.744981508Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.800334149s" Jan 14 01:04:22.746478 containerd[1695]: time="2026-01-14T01:04:22.746299612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 14 01:04:22.751758 containerd[1695]: time="2026-01-14T01:04:22.751741137Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" 
for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 14 01:04:22.765064 containerd[1695]: time="2026-01-14T01:04:22.764916602Z" level=info msg="Container c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:22.780093 containerd[1695]: time="2026-01-14T01:04:22.780038315Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a\"" Jan 14 01:04:22.781966 containerd[1695]: time="2026-01-14T01:04:22.781783424Z" level=info msg="StartContainer for \"c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a\"" Jan 14 01:04:22.783772 containerd[1695]: time="2026-01-14T01:04:22.783749192Z" level=info msg="connecting to shim c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a" address="unix:///run/containerd/s/c4d74ac0321e631164279ed19b87f243676783e48bbefab59d82e4fcba016a17" protocol=ttrpc version=3 Jan 14 01:04:22.805252 systemd[1]: Started cri-containerd-c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a.scope - libcontainer container c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a. Jan 14 01:04:22.855535 kubelet[3316]: E0114 01:04:22.855477 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:22.860000 audit: BPF prog-id=173 op=LOAD Jan 14 01:04:22.862621 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 14 01:04:22.862683 kernel: audit: type=1334 audit(1768352662.860:578): prog-id=173 op=LOAD Jan 14 01:04:22.860000 audit[4086]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.866096 kernel: audit: type=1300 audit(1768352662.860:578): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.860000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.870022 kernel: audit: type=1327 audit(1768352662.860:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.861000 audit: BPF prog-id=174 op=LOAD Jan 14 01:04:22.873315 kernel: audit: type=1334 audit(1768352662.861:579): prog-id=174 op=LOAD Jan 14 01:04:22.861000 audit[4086]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3840 pid=4086 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.875855 kernel: audit: type=1300 audit(1768352662.861:579): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.879674 kernel: audit: type=1327 audit(1768352662.861:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.861000 audit: BPF prog-id=174 op=UNLOAD Jan 14 01:04:22.882456 kernel: audit: type=1334 audit(1768352662.861:580): prog-id=174 op=UNLOAD Jan 14 01:04:22.861000 audit[4086]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.888344 kernel: audit: type=1300 audit(1768352662.861:580): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.888393 kernel: audit: type=1327 audit(1768352662.861:580): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.861000 audit: BPF prog-id=173 op=UNLOAD Jan 14 01:04:22.891594 kernel: audit: type=1334 audit(1768352662.861:581): prog-id=173 op=UNLOAD Jan 14 01:04:22.861000 audit[4086]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.861000 audit: BPF prog-id=175 op=LOAD Jan 14 01:04:22.861000 audit[4086]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 
a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3840 pid=4086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:22.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334613562383039623236383431633934356339303039346435646630 Jan 14 01:04:22.900948 containerd[1695]: time="2026-01-14T01:04:22.900912811Z" level=info msg="StartContainer for \"c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a\" returns successfully" Jan 14 01:04:24.201592 containerd[1695]: time="2026-01-14T01:04:24.201554306Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:04:24.203529 systemd[1]: cri-containerd-c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a.scope: Deactivated successfully. Jan 14 01:04:24.203909 systemd[1]: cri-containerd-c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a.scope: Consumed 469ms CPU time, 194.8M memory peak, 171.3M written to disk. Jan 14 01:04:24.206301 containerd[1695]: time="2026-01-14T01:04:24.206197373Z" level=info msg="received container exit event container_id:\"c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a\" id:\"c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a\" pid:4098 exited_at:{seconds:1768352664 nanos:205893651}" Jan 14 01:04:24.206000 audit: BPF prog-id=175 op=UNLOAD Jan 14 01:04:24.226949 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a-rootfs.mount: Deactivated successfully. Jan 14 01:04:24.291802 kubelet[3316]: I0114 01:04:24.291784 3316 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 14 01:04:24.470924 systemd[1]: Created slice kubepods-besteffort-pod5c41a377_2ae4_44cc_bcba_d2dc8d1e9391.slice - libcontainer container kubepods-besteffort-pod5c41a377_2ae4_44cc_bcba_d2dc8d1e9391.slice. 
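The PROCTITLE fields in the audit records above are the runc invocation hex-encoded, with NUL bytes separating the arguments. A short Go sketch of the decoding, using a shortened prefix of one PROCTITLE value from this log (the full values are truncated here, so only the leading arguments are decoded):

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	// Prefix of a PROCTITLE value from the audit records above.
	proctitle := "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"

	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}

	// Arguments are NUL-separated inside the audit record.
	args := strings.Split(string(raw), "\x00")
	fmt.Println(strings.Join(args, " ")) // runc --root /run/containerd/runc/k8s.io
}
```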
Jan 14 01:04:24.533148 kubelet[3316]: I0114 01:04:24.533037 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmssj\" (UniqueName: \"kubernetes.io/projected/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-kube-api-access-zmssj\") pod \"whisker-676969d5fd-nlldg\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " pod="calico-system/whisker-676969d5fd-nlldg" Jan 14 01:04:24.533148 kubelet[3316]: I0114 01:04:24.533082 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-backend-key-pair\") pod \"whisker-676969d5fd-nlldg\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " pod="calico-system/whisker-676969d5fd-nlldg" Jan 14 01:04:24.533148 kubelet[3316]: I0114 01:04:24.533109 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-ca-bundle\") pod \"whisker-676969d5fd-nlldg\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " pod="calico-system/whisker-676969d5fd-nlldg" Jan 14 01:04:25.318295 systemd[1]: Created slice kubepods-besteffort-pod81b57aeb_9645_4cab_a7a2_931a98fd5ce6.slice - libcontainer container kubepods-besteffort-pod81b57aeb_9645_4cab_a7a2_931a98fd5ce6.slice. Jan 14 01:04:25.337344 kubelet[3316]: I0114 01:04:25.337256 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81b57aeb-9645-4cab-a7a2-931a98fd5ce6-tigera-ca-bundle\") pod \"calico-kube-controllers-86656bdf75-c9kjr\" (UID: \"81b57aeb-9645-4cab-a7a2-931a98fd5ce6\") " pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" Jan 14 01:04:25.337344 kubelet[3316]: I0114 01:04:25.337302 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggxr\" (UniqueName: \"kubernetes.io/projected/81b57aeb-9645-4cab-a7a2-931a98fd5ce6-kube-api-access-hggxr\") pod \"calico-kube-controllers-86656bdf75-c9kjr\" (UID: \"81b57aeb-9645-4cab-a7a2-931a98fd5ce6\") " pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" Jan 14 01:04:25.635232 containerd[1695]: time="2026-01-14T01:04:25.635202824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-676969d5fd-nlldg,Uid:5c41a377-2ae4-44cc-bcba-d2dc8d1e9391,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:25.719225 containerd[1695]: time="2026-01-14T01:04:25.719029873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86656bdf75-c9kjr,Uid:81b57aeb-9645-4cab-a7a2-931a98fd5ce6,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:25.742795 kubelet[3316]: I0114 01:04:25.740226 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b7d50268-8797-458d-a912-f7456846c1f2-calico-apiserver-certs\") pod \"calico-apiserver-694bbc95d6-42t9l\" (UID: \"b7d50268-8797-458d-a912-f7456846c1f2\") " pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" Jan 14 01:04:25.742795 kubelet[3316]: I0114 01:04:25.740636 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f26be41d-3305-4b21-9d76-bde121cc2cce-calico-apiserver-certs\") pod 
\"calico-apiserver-694bbc95d6-tk96l\" (UID: \"f26be41d-3305-4b21-9d76-bde121cc2cce\") " pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" Jan 14 01:04:25.742795 kubelet[3316]: I0114 01:04:25.740656 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdc8\" (UniqueName: \"kubernetes.io/projected/f26be41d-3305-4b21-9d76-bde121cc2cce-kube-api-access-bxdc8\") pod \"calico-apiserver-694bbc95d6-tk96l\" (UID: \"f26be41d-3305-4b21-9d76-bde121cc2cce\") " pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" Jan 14 01:04:25.742795 kubelet[3316]: I0114 01:04:25.740673 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f235fe5-2cf3-4556-b562-f0359308c37e-config-volume\") pod \"coredns-66bc5c9577-c82bn\" (UID: \"7f235fe5-2cf3-4556-b562-f0359308c37e\") " pod="kube-system/coredns-66bc5c9577-c82bn" Jan 14 01:04:25.742795 kubelet[3316]: I0114 01:04:25.740688 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnm9\" (UniqueName: \"kubernetes.io/projected/6dd4e5be-52fe-4768-ad9c-ffbe85d93102-kube-api-access-6hnm9\") pod \"coredns-66bc5c9577-d9j8s\" (UID: \"6dd4e5be-52fe-4768-ad9c-ffbe85d93102\") " pod="kube-system/coredns-66bc5c9577-d9j8s" Jan 14 01:04:25.742989 kubelet[3316]: I0114 01:04:25.740707 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/95ab78ae-ca97-4cab-9490-03b0a50f740c-goldmane-key-pair\") pod \"goldmane-7c778bb748-vc7sq\" (UID: \"95ab78ae-ca97-4cab-9490-03b0a50f740c\") " pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:25.742989 kubelet[3316]: I0114 01:04:25.740721 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksnpp\" (UniqueName: \"kubernetes.io/projected/7f235fe5-2cf3-4556-b562-f0359308c37e-kube-api-access-ksnpp\") pod \"coredns-66bc5c9577-c82bn\" (UID: \"7f235fe5-2cf3-4556-b562-f0359308c37e\") " pod="kube-system/coredns-66bc5c9577-c82bn" Jan 14 01:04:25.742989 kubelet[3316]: I0114 01:04:25.740737 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dd4e5be-52fe-4768-ad9c-ffbe85d93102-config-volume\") pod \"coredns-66bc5c9577-d9j8s\" (UID: \"6dd4e5be-52fe-4768-ad9c-ffbe85d93102\") " pod="kube-system/coredns-66bc5c9577-d9j8s" Jan 14 01:04:25.742989 kubelet[3316]: I0114 01:04:25.740752 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwr6\" (UniqueName: \"kubernetes.io/projected/b7d50268-8797-458d-a912-f7456846c1f2-kube-api-access-nzwr6\") pod \"calico-apiserver-694bbc95d6-42t9l\" (UID: \"b7d50268-8797-458d-a912-f7456846c1f2\") " pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" Jan 14 01:04:25.742989 kubelet[3316]: I0114 01:04:25.740801 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbj8r\" (UniqueName: \"kubernetes.io/projected/95ab78ae-ca97-4cab-9490-03b0a50f740c-kube-api-access-sbj8r\") pod \"goldmane-7c778bb748-vc7sq\" (UID: \"95ab78ae-ca97-4cab-9490-03b0a50f740c\") " pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:25.743106 kubelet[3316]: I0114 01:04:25.740820 3316 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ab78ae-ca97-4cab-9490-03b0a50f740c-config\") pod \"goldmane-7c778bb748-vc7sq\" (UID: \"95ab78ae-ca97-4cab-9490-03b0a50f740c\") " pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:25.743106 kubelet[3316]: I0114 01:04:25.740834 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95ab78ae-ca97-4cab-9490-03b0a50f740c-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-vc7sq\" (UID: \"95ab78ae-ca97-4cab-9490-03b0a50f740c\") " pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:25.744923 systemd[1]: Created slice kubepods-besteffort-podb7d50268_8797_458d_a912_f7456846c1f2.slice - libcontainer container kubepods-besteffort-podb7d50268_8797_458d_a912_f7456846c1f2.slice. Jan 14 01:04:25.758527 systemd[1]: Created slice kubepods-besteffort-pod4395ad87_346f_47f3_8e06_f63944f13a5d.slice - libcontainer container kubepods-besteffort-pod4395ad87_346f_47f3_8e06_f63944f13a5d.slice. Jan 14 01:04:25.766167 containerd[1695]: time="2026-01-14T01:04:25.766127883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gdh9l,Uid:4395ad87-346f-47f3-8e06-f63944f13a5d,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:25.770213 systemd[1]: Created slice kubepods-besteffort-podf26be41d_3305_4b21_9d76_bde121cc2cce.slice - libcontainer container kubepods-besteffort-podf26be41d_3305_4b21_9d76_bde121cc2cce.slice. Jan 14 01:04:25.785064 systemd[1]: Created slice kubepods-besteffort-pod95ab78ae_ca97_4cab_9490_03b0a50f740c.slice - libcontainer container kubepods-besteffort-pod95ab78ae_ca97_4cab_9490_03b0a50f740c.slice. Jan 14 01:04:25.797619 systemd[1]: Created slice kubepods-burstable-pod6dd4e5be_52fe_4768_ad9c_ffbe85d93102.slice - libcontainer container kubepods-burstable-pod6dd4e5be_52fe_4768_ad9c_ffbe85d93102.slice. Jan 14 01:04:25.807055 systemd[1]: Created slice kubepods-burstable-pod7f235fe5_2cf3_4556_b562_f0359308c37e.slice - libcontainer container kubepods-burstable-pod7f235fe5_2cf3_4556_b562_f0359308c37e.slice. 
Jan 14 01:04:25.890274 containerd[1695]: time="2026-01-14T01:04:25.890109074Z" level=error msg="Failed to destroy network for sandbox \"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.895556 containerd[1695]: time="2026-01-14T01:04:25.893745041Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-676969d5fd-nlldg,Uid:5c41a377-2ae4-44cc-bcba-d2dc8d1e9391,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.896027 kubelet[3316]: E0114 01:04:25.895924 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.896027 kubelet[3316]: E0114 01:04:25.896009 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-676969d5fd-nlldg" Jan 14 01:04:25.896138 kubelet[3316]: E0114 01:04:25.896036 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-676969d5fd-nlldg" Jan 14 01:04:25.896913 kubelet[3316]: E0114 01:04:25.896270 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-676969d5fd-nlldg_calico-system(5c41a377-2ae4-44cc-bcba-d2dc8d1e9391)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-676969d5fd-nlldg_calico-system(5c41a377-2ae4-44cc-bcba-d2dc8d1e9391)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"adafaa7a2c68d440909746f0bdb4c80e35e09da6e3173397afe8f51f45fca7a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-676969d5fd-nlldg" podUID="5c41a377-2ae4-44cc-bcba-d2dc8d1e9391" Jan 14 01:04:25.905500 containerd[1695]: time="2026-01-14T01:04:25.905396798Z" level=error msg="Failed to destroy network for sandbox \"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jan 14 01:04:25.908605 containerd[1695]: time="2026-01-14T01:04:25.908568010Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gdh9l,Uid:4395ad87-346f-47f3-8e06-f63944f13a5d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.908930 kubelet[3316]: E0114 01:04:25.908902 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.908985 kubelet[3316]: E0114 01:04:25.908948 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:25.908985 kubelet[3316]: E0114 01:04:25.908973 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gdh9l" Jan 14 01:04:25.909039 kubelet[3316]: E0114 01:04:25.909021 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1a2432d1d0ff24c5ccb632e913f31824beac2776016b8132db73f443c9f8631\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:25.915114 containerd[1695]: time="2026-01-14T01:04:25.915078451Z" level=error msg="Failed to destroy network for sandbox \"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.918312 containerd[1695]: time="2026-01-14T01:04:25.918275516Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86656bdf75-c9kjr,Uid:81b57aeb-9645-4cab-a7a2-931a98fd5ce6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.918794 kubelet[3316]: E0114 01:04:25.918485 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:25.918794 kubelet[3316]: E0114 01:04:25.918526 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" Jan 14 01:04:25.918794 kubelet[3316]: E0114 01:04:25.918546 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" Jan 14 01:04:25.918899 kubelet[3316]: E0114 01:04:25.918591 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ff26554cdebe178351f21189e49d09981214154144bfdecdb956040f1a74449\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:04:25.962930 containerd[1695]: time="2026-01-14T01:04:25.962899192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 14 01:04:26.056683 containerd[1695]: time="2026-01-14T01:04:26.056559474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-42t9l,Uid:b7d50268-8797-458d-a912-f7456846c1f2,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:04:26.079373 containerd[1695]: time="2026-01-14T01:04:26.079265717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-tk96l,Uid:f26be41d-3305-4b21-9d76-bde121cc2cce,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:04:26.094965 containerd[1695]: time="2026-01-14T01:04:26.094928054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-vc7sq,Uid:95ab78ae-ca97-4cab-9490-03b0a50f740c,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:26.107280 containerd[1695]: time="2026-01-14T01:04:26.107218220Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-d9j8s,Uid:6dd4e5be-52fe-4768-ad9c-ffbe85d93102,Namespace:kube-system,Attempt:0,}" Jan 14 01:04:26.112063 containerd[1695]: time="2026-01-14T01:04:26.112029325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c82bn,Uid:7f235fe5-2cf3-4556-b562-f0359308c37e,Namespace:kube-system,Attempt:0,}" Jan 14 01:04:26.121315 containerd[1695]: time="2026-01-14T01:04:26.121228706Z" level=error msg="Failed to destroy network for sandbox \"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.129088 containerd[1695]: time="2026-01-14T01:04:26.129036946Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-42t9l,Uid:b7d50268-8797-458d-a912-f7456846c1f2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.129440 kubelet[3316]: E0114 01:04:26.129408 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.129496 kubelet[3316]: E0114 01:04:26.129456 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" Jan 14 01:04:26.129496 kubelet[3316]: E0114 01:04:26.129481 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" Jan 14 01:04:26.129559 kubelet[3316]: E0114 01:04:26.129526 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fbb4ac5833c0daf6ffb2779efc2a6ff2cd980345c87e6df6f47ba4ec1391fc5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:04:26.148849 containerd[1695]: time="2026-01-14T01:04:26.148497884Z" level=error msg="Failed to destroy network for sandbox \"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.155219 containerd[1695]: time="2026-01-14T01:04:26.155071881Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-tk96l,Uid:f26be41d-3305-4b21-9d76-bde121cc2cce,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.155359 kubelet[3316]: E0114 01:04:26.155276 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.155359 kubelet[3316]: E0114 01:04:26.155318 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" Jan 14 01:04:26.155359 kubelet[3316]: E0114 01:04:26.155335 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" Jan 14 01:04:26.155447 kubelet[3316]: E0114 01:04:26.155382 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4649062604bc5b2fdc02778a8dea985f7f0f992b33418161edbe24bba1c0f8fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:04:26.200860 containerd[1695]: time="2026-01-14T01:04:26.200744288Z" level=error msg="Failed to destroy network for sandbox 
\"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.202769 containerd[1695]: time="2026-01-14T01:04:26.202740552Z" level=error msg="Failed to destroy network for sandbox \"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.205417 containerd[1695]: time="2026-01-14T01:04:26.205381656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-d9j8s,Uid:6dd4e5be-52fe-4768-ad9c-ffbe85d93102,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.205792 kubelet[3316]: E0114 01:04:26.205754 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.205837 kubelet[3316]: E0114 01:04:26.205818 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-d9j8s" Jan 14 01:04:26.205863 kubelet[3316]: E0114 01:04:26.205839 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-d9j8s" Jan 14 01:04:26.206014 kubelet[3316]: E0114 01:04:26.205986 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-d9j8s_kube-system(6dd4e5be-52fe-4768-ad9c-ffbe85d93102)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-d9j8s_kube-system(6dd4e5be-52fe-4768-ad9c-ffbe85d93102)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9c1021dc55631b322f97ad4c7277d0a018c29343dc3d91039c159a35991787f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-d9j8s" podUID="6dd4e5be-52fe-4768-ad9c-ffbe85d93102" Jan 14 01:04:26.209366 containerd[1695]: time="2026-01-14T01:04:26.209267539Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c82bn,Uid:7f235fe5-2cf3-4556-b562-f0359308c37e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.209925 kubelet[3316]: E0114 01:04:26.209782 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.209925 kubelet[3316]: E0114 01:04:26.209833 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-c82bn" Jan 14 01:04:26.209925 kubelet[3316]: E0114 01:04:26.209868 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-c82bn" Jan 14 01:04:26.210132 kubelet[3316]: E0114 01:04:26.209909 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-c82bn_kube-system(7f235fe5-2cf3-4556-b562-f0359308c37e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-c82bn_kube-system(7f235fe5-2cf3-4556-b562-f0359308c37e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d872a9b4d06400e1c728e332b9a38c60d2a2869c126e4700ad2331dbc71feeb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-c82bn" podUID="7f235fe5-2cf3-4556-b562-f0359308c37e" Jan 14 01:04:26.216789 containerd[1695]: time="2026-01-14T01:04:26.216759800Z" level=error msg="Failed to destroy network for sandbox \"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.220222 containerd[1695]: time="2026-01-14T01:04:26.220187966Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-vc7sq,Uid:95ab78ae-ca97-4cab-9490-03b0a50f740c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.220407 kubelet[3316]: E0114 01:04:26.220351 3316 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:04:26.220407 kubelet[3316]: E0114 01:04:26.220391 3316 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:26.220493 kubelet[3316]: E0114 01:04:26.220409 3316 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-vc7sq" Jan 14 01:04:26.220516 kubelet[3316]: E0114 01:04:26.220485 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47cd1a1c1064f14e45e5130e0fe3f2677e064a5da943136f0467b31e1f73b92b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:04:26.452581 systemd[1]: run-netns-cni\x2d72bbb335\x2d950c\x2df3fd\x2d7426\x2dfb1fcce25607.mount: Deactivated successfully. Jan 14 01:04:26.452978 systemd[1]: run-netns-cni\x2d6ad92fea\x2d2d3c\x2d01f7\x2dd675\x2d4695f109704c.mount: Deactivated successfully. Jan 14 01:04:26.453245 systemd[1]: run-netns-cni\x2dac1ef4e8\x2d3586\x2d69be\x2de2a0\x2d9de2200db627.mount: Deactivated successfully. 
Jan 14 01:04:30.621179 kubelet[3316]: I0114 01:04:30.620309 3316 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:04:30.657866 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 14 01:04:30.657988 kernel: audit: type=1325 audit(1768352670.653:584): table=filter:115 family=2 entries=21 op=nft_register_rule pid=4350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:30.653000 audit[4350]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=4350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:30.653000 audit[4350]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffedf3dc600 a2=0 a3=7ffedf3dc5ec items=0 ppid=3467 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:30.665067 kernel: audit: type=1300 audit(1768352670.653:584): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffedf3dc600 a2=0 a3=7ffedf3dc5ec items=0 ppid=3467 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:30.653000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:30.659000 audit[4350]: NETFILTER_CFG table=nat:116 family=2 entries=19 op=nft_register_chain pid=4350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:30.668618 kernel: audit: type=1327 audit(1768352670.653:584): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:30.668656 kernel: audit: type=1325 audit(1768352670.659:585): table=nat:116 family=2 entries=19 op=nft_register_chain pid=4350 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:30.659000 audit[4350]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffedf3dc600 a2=0 a3=7ffedf3dc5ec items=0 ppid=3467 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:30.671936 kernel: audit: type=1300 audit(1768352670.659:585): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffedf3dc600 a2=0 a3=7ffedf3dc5ec items=0 ppid=3467 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:30.659000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:30.675384 kernel: audit: type=1327 audit(1768352670.659:585): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:33.680917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3295457582.mount: Deactivated successfully. 
Jan 14 01:04:33.711098 containerd[1695]: time="2026-01-14T01:04:33.710557350Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:33.712180 containerd[1695]: time="2026-01-14T01:04:33.712157620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 14 01:04:33.713989 containerd[1695]: time="2026-01-14T01:04:33.713959652Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:33.717005 containerd[1695]: time="2026-01-14T01:04:33.716978384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:04:33.717373 containerd[1695]: time="2026-01-14T01:04:33.717311836Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.754377557s" Jan 14 01:04:33.717411 containerd[1695]: time="2026-01-14T01:04:33.717381284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 14 01:04:33.747473 containerd[1695]: time="2026-01-14T01:04:33.747437047Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 14 01:04:33.760753 containerd[1695]: time="2026-01-14T01:04:33.760636348Z" level=info msg="Container 184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:33.772146 containerd[1695]: time="2026-01-14T01:04:33.772017288Z" level=info msg="CreateContainer within sandbox \"b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968\"" Jan 14 01:04:33.772807 containerd[1695]: time="2026-01-14T01:04:33.772631767Z" level=info msg="StartContainer for \"184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968\"" Jan 14 01:04:33.774179 containerd[1695]: time="2026-01-14T01:04:33.774124269Z" level=info msg="connecting to shim 184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968" address="unix:///run/containerd/s/c4d74ac0321e631164279ed19b87f243676783e48bbefab59d82e4fcba016a17" protocol=ttrpc version=3 Jan 14 01:04:33.828257 systemd[1]: Started cri-containerd-184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968.scope - libcontainer container 184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968. 
Jan 14 01:04:33.881000 audit: BPF prog-id=176 op=LOAD Jan 14 01:04:33.881000 audit[4359]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.886077 kernel: audit: type=1334 audit(1768352673.881:586): prog-id=176 op=LOAD Jan 14 01:04:33.886133 kernel: audit: type=1300 audit(1768352673.881:586): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.890553 kernel: audit: type=1327 audit(1768352673.881:586): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.881000 audit: BPF prog-id=177 op=LOAD Jan 14 01:04:33.893650 kernel: audit: type=1334 audit(1768352673.881:587): prog-id=177 op=LOAD Jan 14 01:04:33.881000 audit[4359]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.881000 audit: BPF prog-id=177 op=UNLOAD Jan 14 01:04:33.881000 audit[4359]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.881000 audit: BPF prog-id=176 op=UNLOAD Jan 14 01:04:33.881000 audit[4359]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.881000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.881000 audit: BPF prog-id=178 op=LOAD Jan 14 01:04:33.881000 audit[4359]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3840 pid=4359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:33.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138346631376335633832333331613165626233653961653130643034 Jan 14 01:04:33.913575 containerd[1695]: time="2026-01-14T01:04:33.913538570Z" level=info msg="StartContainer for \"184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968\" returns successfully" Jan 14 01:04:34.011196 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 14 01:04:34.011300 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 14 01:04:34.013267 kubelet[3316]: I0114 01:04:34.013219 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mw7w8" podStartSLOduration=1.543357765 podStartE2EDuration="22.013205398s" podCreationTimestamp="2026-01-14 01:04:12 +0000 UTC" firstStartedPulling="2026-01-14 01:04:13.256876568 +0000 UTC m=+19.508539854" lastFinishedPulling="2026-01-14 01:04:33.726724203 +0000 UTC m=+39.978387487" observedRunningTime="2026-01-14 01:04:34.012915355 +0000 UTC m=+40.264578662" watchObservedRunningTime="2026-01-14 01:04:34.013205398 +0000 UTC m=+40.264868706" Jan 14 01:04:34.191221 kubelet[3316]: I0114 01:04:34.191185 3316 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmssj\" (UniqueName: \"kubernetes.io/projected/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-kube-api-access-zmssj\") pod \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " Jan 14 01:04:34.191583 kubelet[3316]: I0114 01:04:34.191529 3316 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-backend-key-pair\") pod \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " Jan 14 01:04:34.191583 kubelet[3316]: I0114 01:04:34.191560 3316 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-ca-bundle\") pod \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\" (UID: \"5c41a377-2ae4-44cc-bcba-d2dc8d1e9391\") " Jan 14 01:04:34.196605 kubelet[3316]: I0114 01:04:34.195550 3316 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391" (UID: "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 14 01:04:34.197173 kubelet[3316]: I0114 01:04:34.197138 3316 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391" (UID: "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 14 01:04:34.198138 kubelet[3316]: I0114 01:04:34.198118 3316 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-kube-api-access-zmssj" (OuterVolumeSpecName: "kube-api-access-zmssj") pod "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391" (UID: "5c41a377-2ae4-44cc-bcba-d2dc8d1e9391"). InnerVolumeSpecName "kube-api-access-zmssj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 14 01:04:34.292397 kubelet[3316]: I0114 01:04:34.292297 3316 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-ca-bundle\") on node \"ci-4547-0-0-n-de0c74fc75\" DevicePath \"\"" Jan 14 01:04:34.292397 kubelet[3316]: I0114 01:04:34.292323 3316 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zmssj\" (UniqueName: \"kubernetes.io/projected/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-kube-api-access-zmssj\") on node \"ci-4547-0-0-n-de0c74fc75\" DevicePath \"\"" Jan 14 01:04:34.292397 kubelet[3316]: I0114 01:04:34.292332 3316 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391-whisker-backend-key-pair\") on node \"ci-4547-0-0-n-de0c74fc75\" DevicePath \"\"" Jan 14 01:04:34.681623 systemd[1]: var-lib-kubelet-pods-5c41a377\x2d2ae4\x2d44cc\x2dbcba\x2dd2dc8d1e9391-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzmssj.mount: Deactivated successfully. Jan 14 01:04:34.681707 systemd[1]: var-lib-kubelet-pods-5c41a377\x2d2ae4\x2d44cc\x2dbcba\x2dd2dc8d1e9391-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 14 01:04:35.000860 systemd[1]: Removed slice kubepods-besteffort-pod5c41a377_2ae4_44cc_bcba_d2dc8d1e9391.slice - libcontainer container kubepods-besteffort-pod5c41a377_2ae4_44cc_bcba_d2dc8d1e9391.slice. Jan 14 01:04:35.068012 systemd[1]: Created slice kubepods-besteffort-pod51002e26_f95b_49f1_8f48_be4a381935eb.slice - libcontainer container kubepods-besteffort-pod51002e26_f95b_49f1_8f48_be4a381935eb.slice. 
Jan 14 01:04:35.097109 kubelet[3316]: I0114 01:04:35.097073 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/51002e26-f95b-49f1-8f48-be4a381935eb-whisker-backend-key-pair\") pod \"whisker-d94f85b8d-frfrp\" (UID: \"51002e26-f95b-49f1-8f48-be4a381935eb\") " pod="calico-system/whisker-d94f85b8d-frfrp" Jan 14 01:04:35.097109 kubelet[3316]: I0114 01:04:35.097116 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51002e26-f95b-49f1-8f48-be4a381935eb-whisker-ca-bundle\") pod \"whisker-d94f85b8d-frfrp\" (UID: \"51002e26-f95b-49f1-8f48-be4a381935eb\") " pod="calico-system/whisker-d94f85b8d-frfrp" Jan 14 01:04:35.097467 kubelet[3316]: I0114 01:04:35.097130 3316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xmt\" (UniqueName: \"kubernetes.io/projected/51002e26-f95b-49f1-8f48-be4a381935eb-kube-api-access-h6xmt\") pod \"whisker-d94f85b8d-frfrp\" (UID: \"51002e26-f95b-49f1-8f48-be4a381935eb\") " pod="calico-system/whisker-d94f85b8d-frfrp" Jan 14 01:04:35.375780 containerd[1695]: time="2026-01-14T01:04:35.375303150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d94f85b8d-frfrp,Uid:51002e26-f95b-49f1-8f48-be4a381935eb,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:35.667000 audit: BPF prog-id=179 op=LOAD Jan 14 01:04:35.669387 kernel: kauditd_printk_skb: 11 callbacks suppressed Jan 14 01:04:35.669440 kernel: audit: type=1334 audit(1768352675.667:591): prog-id=179 op=LOAD Jan 14 01:04:35.667000 audit[4557]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea284fba0 a2=98 a3=1fffffffffffffff items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.672475 kernel: audit: type=1300 audit(1768352675.667:591): arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea284fba0 a2=98 a3=1fffffffffffffff items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.675064 kernel: audit: type=1327 audit(1768352675.667:591): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.667000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.669000 audit: BPF prog-id=179 op=UNLOAD Jan 14 01:04:35.669000 audit[4557]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffea284fb70 a3=0 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.681997 kernel: audit: type=1334 audit(1768352675.669:592): prog-id=179 op=UNLOAD Jan 14 
01:04:35.682070 kernel: audit: type=1300 audit(1768352675.669:592): arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffea284fb70 a3=0 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.686031 kernel: audit: type=1327 audit(1768352675.669:592): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.669000 audit: BPF prog-id=180 op=LOAD Jan 14 01:04:35.689254 kernel: audit: type=1334 audit(1768352675.669:593): prog-id=180 op=LOAD Jan 14 01:04:35.689276 kernel: audit: type=1300 audit(1768352675.669:593): arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea284fa80 a2=94 a3=3 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit[4557]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea284fa80 a2=94 a3=3 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.694790 kernel: audit: type=1327 audit(1768352675.669:593): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.669000 audit: BPF prog-id=180 op=UNLOAD Jan 14 01:04:35.697501 kernel: audit: type=1334 audit(1768352675.669:594): prog-id=180 op=UNLOAD Jan 14 01:04:35.669000 audit[4557]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea284fa80 a2=94 a3=3 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.669000 audit: BPF prog-id=181 op=LOAD Jan 14 01:04:35.669000 audit[4557]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea284fac0 a2=94 a3=7ffea284fca0 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.669000 audit: BPF prog-id=181 op=UNLOAD Jan 14 01:04:35.669000 audit[4557]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea284fac0 a2=94 a3=7ffea284fca0 items=0 ppid=4451 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.669000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:04:35.670000 audit: BPF prog-id=182 op=LOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffed2bb6570 a2=98 a3=3 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.670000 audit: BPF prog-id=182 op=UNLOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffed2bb6540 a3=0 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.670000 audit: BPF prog-id=183 op=LOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffed2bb6360 a2=94 a3=54428f items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.670000 audit: BPF prog-id=183 op=UNLOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffed2bb6360 a2=94 a3=54428f items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.670000 audit: BPF prog-id=184 op=LOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffed2bb6390 a2=94 a3=2 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.670000 audit: BPF 
prog-id=184 op=UNLOAD Jan 14 01:04:35.670000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffed2bb6390 a2=0 a3=2 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.670000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.817000 audit: BPF prog-id=185 op=LOAD Jan 14 01:04:35.817000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffed2bb6250 a2=94 a3=1 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.817000 audit: BPF prog-id=185 op=UNLOAD Jan 14 01:04:35.817000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffed2bb6250 a2=94 a3=1 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.817000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=186 op=LOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffed2bb6240 a2=94 a3=4 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=186 op=UNLOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffed2bb6240 a2=0 a3=4 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=187 op=LOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffed2bb60a0 a2=94 a3=5 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=187 op=UNLOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffed2bb60a0 a2=0 a3=5 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=188 op=LOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffed2bb62c0 a2=94 a3=6 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.827000 audit: BPF prog-id=188 op=UNLOAD Jan 14 01:04:35.827000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffed2bb62c0 a2=0 a3=6 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.828000 audit: BPF prog-id=189 op=LOAD Jan 14 01:04:35.828000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffed2bb5a70 a2=94 a3=88 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.828000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.828000 audit: BPF prog-id=190 op=LOAD Jan 14 01:04:35.828000 audit[4558]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffed2bb58f0 a2=94 a3=2 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.828000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.828000 audit: BPF prog-id=190 op=UNLOAD Jan 14 01:04:35.828000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffed2bb5920 a2=0 a3=7ffed2bb5a20 items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.828000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.829000 audit: BPF prog-id=189 op=UNLOAD Jan 14 01:04:35.829000 audit[4558]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=1497d10 a2=0 a3=7207899720bdf0cc items=0 ppid=4451 pid=4558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.829000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:04:35.835000 audit: BPF prog-id=191 op=LOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4a423480 a2=98 a3=1999999999999999 items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.835000 audit: BPF prog-id=191 op=UNLOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 
a2=7ffc4a423450 a3=0 items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.835000 audit: BPF prog-id=192 op=LOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4a423360 a2=94 a3=ffff items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.835000 audit: BPF prog-id=192 op=UNLOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc4a423360 a2=94 a3=ffff items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.835000 audit: BPF prog-id=193 op=LOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4a4233a0 a2=94 a3=7ffc4a423580 items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.835000 audit: BPF prog-id=193 op=UNLOAD Jan 14 01:04:35.835000 audit[4561]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc4a4233a0 a2=94 a3=7ffc4a423580 items=0 ppid=4451 pid=4561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:35.835000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:04:35.857425 kubelet[3316]: I0114 01:04:35.857390 3316 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c41a377-2ae4-44cc-bcba-d2dc8d1e9391" 
path="/var/lib/kubelet/pods/5c41a377-2ae4-44cc-bcba-d2dc8d1e9391/volumes" Jan 14 01:04:36.166374 systemd-networkd[1385]: vxlan.calico: Link UP Jan 14 01:04:36.166380 systemd-networkd[1385]: vxlan.calico: Gained carrier Jan 14 01:04:36.219000 audit: BPF prog-id=194 op=LOAD Jan 14 01:04:36.219000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea7870340 a2=98 a3=0 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.219000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.219000 audit: BPF prog-id=194 op=UNLOAD Jan 14 01:04:36.219000 audit[4588]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffea7870310 a3=0 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.219000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.219000 audit: BPF prog-id=195 op=LOAD Jan 14 01:04:36.219000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea7870150 a2=94 a3=54428f items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.219000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.219000 audit: BPF prog-id=195 op=UNLOAD Jan 14 01:04:36.219000 audit[4588]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea7870150 a2=94 a3=54428f items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.219000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=196 op=LOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea7870180 a2=94 a3=2 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=196 op=UNLOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea7870180 a2=0 a3=2 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=197 op=LOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffea786ff30 a2=94 a3=4 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=197 op=UNLOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffea786ff30 a2=94 a3=4 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=198 op=LOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffea7870030 a2=94 a3=7ffea78701b0 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=198 op=UNLOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffea7870030 a2=0 a3=7ffea78701b0 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=199 op=LOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffea786f760 a2=94 a3=2 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: 
PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=199 op=UNLOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffea786f760 a2=0 a3=2 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.220000 audit: BPF prog-id=200 op=LOAD Jan 14 01:04:36.220000 audit[4588]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffea786f860 a2=94 a3=30 items=0 ppid=4451 pid=4588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.220000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:04:36.230000 audit: BPF prog-id=201 op=LOAD Jan 14 01:04:36.230000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff8f374d60 a2=98 a3=0 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.230000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.231000 audit: BPF prog-id=201 op=UNLOAD Jan 14 01:04:36.231000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff8f374d30 a3=0 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.231000 audit: BPF prog-id=202 op=LOAD Jan 14 01:04:36.231000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff8f374b50 a2=94 a3=54428f items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.231000 audit: BPF prog-id=202 op=UNLOAD Jan 14 01:04:36.231000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff8f374b50 a2=94 a3=54428f items=0 
ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.231000 audit: BPF prog-id=203 op=LOAD Jan 14 01:04:36.231000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff8f374b80 a2=94 a3=2 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.231000 audit: BPF prog-id=203 op=UNLOAD Jan 14 01:04:36.231000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff8f374b80 a2=0 a3=2 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.392000 audit: BPF prog-id=204 op=LOAD Jan 14 01:04:36.392000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff8f374a40 a2=94 a3=1 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.392000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.392000 audit: BPF prog-id=204 op=UNLOAD Jan 14 01:04:36.392000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff8f374a40 a2=94 a3=1 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.392000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.405000 audit: BPF prog-id=205 op=LOAD Jan 14 01:04:36.405000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff8f374a30 a2=94 a3=4 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.405000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.405000 audit: BPF prog-id=205 op=UNLOAD Jan 
14 01:04:36.405000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff8f374a30 a2=0 a3=4 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.405000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.405000 audit: BPF prog-id=206 op=LOAD Jan 14 01:04:36.405000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff8f374890 a2=94 a3=5 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.405000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.405000 audit: BPF prog-id=206 op=UNLOAD Jan 14 01:04:36.405000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff8f374890 a2=0 a3=5 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.405000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=207 op=LOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff8f374ab0 a2=94 a3=6 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=207 op=UNLOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff8f374ab0 a2=0 a3=6 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=208 op=LOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff8f374260 a2=94 a3=88 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=209 op=LOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff8f3740e0 a2=94 a3=2 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=209 op=UNLOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff8f374110 a2=0 a3=7fff8f374210 items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.406000 audit: BPF prog-id=208 op=UNLOAD Jan 14 01:04:36.406000 audit[4593]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=24a7fd10 a2=0 a3=b376e5e098eb022e items=0 ppid=4451 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.406000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:04:36.415000 audit: BPF prog-id=200 op=UNLOAD Jan 14 01:04:36.415000 audit[4451]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c001076100 a2=0 a3=0 items=0 ppid=4431 pid=4451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.415000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 14 01:04:36.473004 systemd-networkd[1385]: calieb5108feda8: Link UP Jan 14 01:04:36.474812 systemd-networkd[1385]: calieb5108feda8: Gained carrier Jan 14 01:04:36.487000 audit[4622]: NETFILTER_CFG table=nat:117 family=2 entries=15 op=nft_register_chain pid=4622 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.487000 audit[4622]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffcf1e24c20 a2=0 a3=7ffcf1e24c0c items=0 ppid=4451 pid=4622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.487000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:36.493931 containerd[1695]: 2026-01-14 01:04:35.428 [INFO][4462] cni-plugin/utils.go 100: File /var/lib/calico/mtu does 
not exist Jan 14 01:04:36.493931 containerd[1695]: 2026-01-14 01:04:36.379 [INFO][4462] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0 whisker-d94f85b8d- calico-system 51002e26-f95b-49f1-8f48-be4a381935eb 884 0 2026-01-14 01:04:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:d94f85b8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 whisker-d94f85b8d-frfrp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calieb5108feda8 [] [] }} ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-" Jan 14 01:04:36.493931 containerd[1695]: 2026-01-14 01:04:36.380 [INFO][4462] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.493931 containerd[1695]: 2026-01-14 01:04:36.411 [INFO][4597] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" HandleID="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Workload="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.411 [INFO][4597] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" HandleID="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Workload="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"whisker-d94f85b8d-frfrp", "timestamp":"2026-01-14 01:04:36.411653704 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.411 [INFO][4597] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.411 [INFO][4597] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.411 [INFO][4597] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.422 [INFO][4597] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.428 [INFO][4597] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.432 [INFO][4597] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.434 [INFO][4597] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.494944 containerd[1695]: 2026-01-14 01:04:36.437 [INFO][4597] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.437 [INFO][4597] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.439 [INFO][4597] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21 Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.444 [INFO][4597] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.451 [INFO][4597] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.1/26] block=192.168.31.0/26 handle="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.451 [INFO][4597] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.1/26] handle="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.451 [INFO][4597] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:04:36.495137 containerd[1695]: 2026-01-14 01:04:36.451 [INFO][4597] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.1/26] IPv6=[] ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" HandleID="k8s-pod-network.20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Workload="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.495257 containerd[1695]: 2026-01-14 01:04:36.454 [INFO][4462] cni-plugin/k8s.go 418: Populated endpoint ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0", GenerateName:"whisker-d94f85b8d-", Namespace:"calico-system", SelfLink:"", UID:"51002e26-f95b-49f1-8f48-be4a381935eb", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d94f85b8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"whisker-d94f85b8d-frfrp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.31.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calieb5108feda8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:36.495257 containerd[1695]: 2026-01-14 01:04:36.454 [INFO][4462] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.1/32] ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.495323 containerd[1695]: 2026-01-14 01:04:36.454 [INFO][4462] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb5108feda8 ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.495323 containerd[1695]: 2026-01-14 01:04:36.475 [INFO][4462] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.495359 containerd[1695]: 2026-01-14 01:04:36.477 [INFO][4462] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" 
Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0", GenerateName:"whisker-d94f85b8d-", Namespace:"calico-system", SelfLink:"", UID:"51002e26-f95b-49f1-8f48-be4a381935eb", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d94f85b8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21", Pod:"whisker-d94f85b8d-frfrp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.31.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calieb5108feda8", MAC:"ce:2f:90:cc:1e:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:36.495405 containerd[1695]: 2026-01-14 01:04:36.491 [INFO][4462] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" Namespace="calico-system" Pod="whisker-d94f85b8d-frfrp" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-whisker--d94f85b8d--frfrp-eth0" Jan 14 01:04:36.503000 audit[4635]: NETFILTER_CFG table=mangle:118 family=2 entries=16 op=nft_register_chain pid=4635 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.503000 audit[4635]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7fff63c0eee0 a2=0 a3=7fff63c0eecc items=0 ppid=4451 pid=4635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.503000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:36.505000 audit[4629]: NETFILTER_CFG table=filter:119 family=2 entries=39 op=nft_register_chain pid=4629 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.505000 audit[4629]: SYSCALL arch=c000003e syscall=46 success=yes exit=18968 a0=3 a1=7ffe6f699d90 a2=0 a3=7ffe6f699d7c items=0 ppid=4451 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.505000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:36.507000 audit[4628]: NETFILTER_CFG table=raw:120 family=2 entries=21 op=nft_register_chain pid=4628 
subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.507000 audit[4628]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffceb00a880 a2=0 a3=7ffceb00a86c items=0 ppid=4451 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.507000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:36.544055 containerd[1695]: time="2026-01-14T01:04:36.544015226Z" level=info msg="connecting to shim 20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21" address="unix:///run/containerd/s/a3881eb2b55571cbb7f7eecd69a1c78ff69771b3cc5d7a46249d0003d104e521" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:36.537000 audit[4643]: NETFILTER_CFG table=filter:121 family=2 entries=59 op=nft_register_chain pid=4643 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.537000 audit[4643]: SYSCALL arch=c000003e syscall=46 success=yes exit=35860 a0=3 a1=7ffdec5a2430 a2=0 a3=7ffdec5a241c items=0 ppid=4451 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.537000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:36.571211 systemd[1]: Started cri-containerd-20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21.scope - libcontainer container 20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21. 
Jan 14 01:04:36.580000 audit: BPF prog-id=210 op=LOAD Jan 14 01:04:36.580000 audit: BPF prog-id=211 op=LOAD Jan 14 01:04:36.580000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.580000 audit: BPF prog-id=211 op=UNLOAD Jan 14 01:04:36.580000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.581000 audit: BPF prog-id=212 op=LOAD Jan 14 01:04:36.581000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.581000 audit: BPF prog-id=213 op=LOAD Jan 14 01:04:36.581000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.581000 audit: BPF prog-id=213 op=UNLOAD Jan 14 01:04:36.581000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.581000 audit: BPF prog-id=212 op=UNLOAD Jan 14 01:04:36.581000 audit[4667]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.581000 audit: BPF prog-id=214 op=LOAD Jan 14 01:04:36.581000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4654 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.581000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230626261633036616161663831666363663633366331656166326564 Jan 14 01:04:36.614005 containerd[1695]: time="2026-01-14T01:04:36.613949481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d94f85b8d-frfrp,Uid:51002e26-f95b-49f1-8f48-be4a381935eb,Namespace:calico-system,Attempt:0,} returns sandbox id \"20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21\"" Jan 14 01:04:36.615592 containerd[1695]: time="2026-01-14T01:04:36.615402926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:04:36.858509 containerd[1695]: time="2026-01-14T01:04:36.858474662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86656bdf75-c9kjr,Uid:81b57aeb-9645-4cab-a7a2-931a98fd5ce6,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:36.959506 systemd-networkd[1385]: cali4d8537906b1: Link UP Jan 14 01:04:36.960161 systemd-networkd[1385]: cali4d8537906b1: Gained carrier Jan 14 01:04:36.971471 containerd[1695]: time="2026-01-14T01:04:36.971332440Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:36.973022 containerd[1695]: time="2026-01-14T01:04:36.972992247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:36.973189 containerd[1695]: time="2026-01-14T01:04:36.973154456Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:04:36.973304 containerd[1695]: 2026-01-14 01:04:36.901 [INFO][4692] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0 calico-kube-controllers-86656bdf75- calico-system 81b57aeb-9645-4cab-a7a2-931a98fd5ce6 806 0 2026-01-14 01:04:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86656bdf75 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 calico-kube-controllers-86656bdf75-c9kjr 
eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4d8537906b1 [] [] }} ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-" Jan 14 01:04:36.973304 containerd[1695]: 2026-01-14 01:04:36.901 [INFO][4692] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.973304 containerd[1695]: 2026-01-14 01:04:36.926 [INFO][4705] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" HandleID="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.926 [INFO][4705] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" HandleID="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5800), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"calico-kube-controllers-86656bdf75-c9kjr", "timestamp":"2026-01-14 01:04:36.926566298 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.926 [INFO][4705] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.926 [INFO][4705] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.926 [INFO][4705] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.933 [INFO][4705] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.937 [INFO][4705] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.940 [INFO][4705] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.942 [INFO][4705] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973415 containerd[1695]: 2026-01-14 01:04:36.943 [INFO][4705] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.943 [INFO][4705] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.944 [INFO][4705] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.948 [INFO][4705] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.955 [INFO][4705] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.2/26] block=192.168.31.0/26 handle="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.955 [INFO][4705] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.2/26] handle="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.955 [INFO][4705] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:04:36.973785 containerd[1695]: 2026-01-14 01:04:36.955 [INFO][4705] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.2/26] IPv6=[] ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" HandleID="k8s-pod-network.ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.973924 kubelet[3316]: E0114 01:04:36.973643 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:04:36.974228 containerd[1695]: 2026-01-14 01:04:36.956 [INFO][4692] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0", GenerateName:"calico-kube-controllers-86656bdf75-", Namespace:"calico-system", SelfLink:"", UID:"81b57aeb-9645-4cab-a7a2-931a98fd5ce6", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86656bdf75", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"calico-kube-controllers-86656bdf75-c9kjr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4d8537906b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:36.974292 containerd[1695]: 2026-01-14 01:04:36.957 [INFO][4692] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.2/32] ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.974292 containerd[1695]: 2026-01-14 01:04:36.957 [INFO][4692] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4d8537906b1 ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.974292 
containerd[1695]: 2026-01-14 01:04:36.960 [INFO][4692] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.974948 containerd[1695]: 2026-01-14 01:04:36.960 [INFO][4692] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0", GenerateName:"calico-kube-controllers-86656bdf75-", Namespace:"calico-system", SelfLink:"", UID:"81b57aeb-9645-4cab-a7a2-931a98fd5ce6", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86656bdf75", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b", Pod:"calico-kube-controllers-86656bdf75-c9kjr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4d8537906b1", MAC:"02:c0:18:39:f5:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:36.975008 kubelet[3316]: E0114 01:04:36.974598 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:04:36.975039 containerd[1695]: 2026-01-14 01:04:36.971 [INFO][4692] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" Namespace="calico-system" Pod="calico-kube-controllers-86656bdf75-c9kjr" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--kube--controllers--86656bdf75--c9kjr-eth0" Jan 14 01:04:36.975295 kubelet[3316]: E0114 01:04:36.974701 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:36.976785 containerd[1695]: time="2026-01-14T01:04:36.976760274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:04:36.989000 audit[4718]: NETFILTER_CFG table=filter:122 family=2 entries=36 op=nft_register_chain pid=4718 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:36.989000 audit[4718]: SYSCALL arch=c000003e syscall=46 success=yes exit=19576 a0=3 a1=7fff1d2ccb10 a2=0 a3=7fff1d2ccafc items=0 ppid=4451 pid=4718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:36.989000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:37.007203 containerd[1695]: time="2026-01-14T01:04:37.007168534Z" level=info msg="connecting to shim ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b" address="unix:///run/containerd/s/b2a66a0b69cefd16e3b1d23d5b22abe801927c0192b17162519d6c18266183a5" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:37.038278 systemd[1]: Started cri-containerd-ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b.scope - libcontainer container ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b. Jan 14 01:04:37.047000 audit: BPF prog-id=215 op=LOAD Jan 14 01:04:37.047000 audit: BPF prog-id=216 op=LOAD Jan 14 01:04:37.047000 audit[4739]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.047000 audit: BPF prog-id=216 op=UNLOAD Jan 14 01:04:37.047000 audit[4739]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.048000 audit: BPF prog-id=217 op=LOAD Jan 14 01:04:37.048000 audit[4739]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.048000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.048000 audit: BPF prog-id=218 op=LOAD Jan 14 01:04:37.048000 audit[4739]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.048000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.048000 audit: BPF prog-id=218 op=UNLOAD Jan 14 01:04:37.048000 audit[4739]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.048000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.048000 audit: BPF prog-id=217 op=UNLOAD Jan 14 01:04:37.048000 audit[4739]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.048000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.048000 audit: BPF prog-id=219 op=LOAD Jan 14 01:04:37.048000 audit[4739]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4728 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:37.048000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566303864633165393763663338306137623137326533646632353637 Jan 14 01:04:37.081491 containerd[1695]: time="2026-01-14T01:04:37.081461134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86656bdf75-c9kjr,Uid:81b57aeb-9645-4cab-a7a2-931a98fd5ce6,Namespace:calico-system,Attempt:0,} returns sandbox id \"ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b\"" Jan 14 01:04:37.305794 containerd[1695]: time="2026-01-14T01:04:37.305690213Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:37.308778 containerd[1695]: 
time="2026-01-14T01:04:37.308683380Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:04:37.308778 containerd[1695]: time="2026-01-14T01:04:37.308753376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:37.308998 kubelet[3316]: E0114 01:04:37.308879 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:04:37.308998 kubelet[3316]: E0114 01:04:37.308944 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:04:37.309277 kubelet[3316]: E0114 01:04:37.309115 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:37.309277 kubelet[3316]: E0114 01:04:37.309154 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:04:37.310602 containerd[1695]: time="2026-01-14T01:04:37.310432407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:04:37.332913 systemd-networkd[1385]: vxlan.calico: Gained IPv6LL Jan 14 01:04:37.645742 containerd[1695]: time="2026-01-14T01:04:37.645640840Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:37.647269 containerd[1695]: time="2026-01-14T01:04:37.647235886Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:04:37.647343 containerd[1695]: time="2026-01-14T01:04:37.647290493Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:37.647486 kubelet[3316]: E0114 01:04:37.647454 3316 log.go:32] "PullImage 
from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:04:37.647526 kubelet[3316]: E0114 01:04:37.647493 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:04:37.647585 kubelet[3316]: E0114 01:04:37.647554 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:37.647616 kubelet[3316]: E0114 01:04:37.647600 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:04:37.859994 containerd[1695]: time="2026-01-14T01:04:37.859961986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c82bn,Uid:7f235fe5-2cf3-4556-b562-f0359308c37e,Namespace:kube-system,Attempt:0,}" Jan 14 01:04:37.862289 containerd[1695]: time="2026-01-14T01:04:37.862268777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gdh9l,Uid:4395ad87-346f-47f3-8e06-f63944f13a5d,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:37.970850 systemd-networkd[1385]: calida71fc46531: Link UP Jan 14 01:04:37.973613 systemd-networkd[1385]: calida71fc46531: Gained carrier Jan 14 01:04:37.987328 containerd[1695]: 2026-01-14 01:04:37.908 [INFO][4777] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0 csi-node-driver- calico-system 4395ad87-346f-47f3-8e06-f63944f13a5d 698 0 2026-01-14 01:04:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 csi-node-driver-gdh9l eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calida71fc46531 [] [] }} ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-" Jan 14 01:04:37.987328 containerd[1695]: 2026-01-14 01:04:37.908 [INFO][4777] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" 
Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.987328 containerd[1695]: 2026-01-14 01:04:37.935 [INFO][4791] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" HandleID="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Workload="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.935 [INFO][4791] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" HandleID="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Workload="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"csi-node-driver-gdh9l", "timestamp":"2026-01-14 01:04:37.935835389 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.936 [INFO][4791] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.936 [INFO][4791] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.936 [INFO][4791] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.942 [INFO][4791] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.946 [INFO][4791] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.950 [INFO][4791] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.952 [INFO][4791] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988378 containerd[1695]: 2026-01-14 01:04:37.954 [INFO][4791] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.954 [INFO][4791] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.955 [INFO][4791] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.958 [INFO][4791] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" host="ci-4547-0-0-n-de0c74fc75" Jan 
14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4791] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.3/26] block=192.168.31.0/26 handle="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4791] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.3/26] handle="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4791] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:04:37.988567 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4791] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.3/26] IPv6=[] ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" HandleID="k8s-pod-network.3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Workload="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.989234 containerd[1695]: 2026-01-14 01:04:37.966 [INFO][4777] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4395ad87-346f-47f3-8e06-f63944f13a5d", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"csi-node-driver-gdh9l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.31.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calida71fc46531", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:37.989331 containerd[1695]: 2026-01-14 01:04:37.966 [INFO][4777] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.3/32] ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.989331 containerd[1695]: 2026-01-14 01:04:37.966 [INFO][4777] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida71fc46531 ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" 
Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.989331 containerd[1695]: 2026-01-14 01:04:37.975 [INFO][4777] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:37.989433 containerd[1695]: 2026-01-14 01:04:37.976 [INFO][4777] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4395ad87-346f-47f3-8e06-f63944f13a5d", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba", Pod:"csi-node-driver-gdh9l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.31.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calida71fc46531", MAC:"6e:f5:02:6b:66:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:37.989578 containerd[1695]: 2026-01-14 01:04:37.986 [INFO][4777] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" Namespace="calico-system" Pod="csi-node-driver-gdh9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-csi--node--driver--gdh9l-eth0" Jan 14 01:04:38.000000 audit[4813]: NETFILTER_CFG table=filter:123 family=2 entries=40 op=nft_register_chain pid=4813 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:38.000000 audit[4813]: SYSCALL arch=c000003e syscall=46 success=yes exit=20764 a0=3 a1=7fffb239eef0 a2=0 a3=7fffb239eedc items=0 ppid=4451 pid=4813 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.000000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 
Jan 14 01:04:38.014851 kubelet[3316]: E0114 01:04:38.011559 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:04:38.019352 kubelet[3316]: E0114 01:04:38.019322 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:04:38.024171 containerd[1695]: time="2026-01-14T01:04:38.024140043Z" level=info msg="connecting to shim 3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba" address="unix:///run/containerd/s/6f7ad5da5c75232bb7846f67801aeb3b9ad502a7db280310206af74b1e18b1ac" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:38.056272 systemd[1]: Started cri-containerd-3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba.scope - libcontainer container 3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba. 
Jan 14 01:04:38.067000 audit: BPF prog-id=220 op=LOAD Jan 14 01:04:38.068000 audit: BPF prog-id=221 op=LOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.068000 audit: BPF prog-id=221 op=UNLOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.068000 audit: BPF prog-id=222 op=LOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.068000 audit: BPF prog-id=223 op=LOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.068000 audit: BPF prog-id=223 op=UNLOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.068000 audit: BPF prog-id=222 op=UNLOAD Jan 14 01:04:38.068000 audit[4835]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.069000 audit: BPF prog-id=224 op=LOAD Jan 14 01:04:38.069000 audit[4835]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4823 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3362626132326363353166363864323062316133636332366331306466 Jan 14 01:04:38.094500 containerd[1695]: time="2026-01-14T01:04:38.094473705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gdh9l,Uid:4395ad87-346f-47f3-8e06-f63944f13a5d,Namespace:calico-system,Attempt:0,} returns sandbox id \"3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba\"" Jan 14 01:04:38.097738 containerd[1695]: time="2026-01-14T01:04:38.097145890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:04:38.102000 audit[4862]: NETFILTER_CFG table=filter:124 family=2 entries=20 op=nft_register_rule pid=4862 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:38.102681 systemd-networkd[1385]: cali0f5b3d82ccc: Link UP Jan 14 01:04:38.102853 systemd-networkd[1385]: cali0f5b3d82ccc: Gained carrier Jan 14 01:04:38.102000 audit[4862]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc6f1c0fc0 a2=0 a3=7ffc6f1c0fac items=0 ppid=3467 pid=4862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.102000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:38.106000 audit[4862]: NETFILTER_CFG table=nat:125 family=2 entries=14 op=nft_register_rule pid=4862 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:38.106000 audit[4862]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc6f1c0fc0 a2=0 a3=0 items=0 ppid=3467 pid=4862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.106000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:38.119322 containerd[1695]: 2026-01-14 01:04:37.926 [INFO][4768] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0 coredns-66bc5c9577- kube-system 7f235fe5-2cf3-4556-b562-f0359308c37e 811 0 2026-01-14 01:03:59 
+0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 coredns-66bc5c9577-c82bn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0f5b3d82ccc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-" Jan 14 01:04:38.119322 containerd[1695]: 2026-01-14 01:04:37.926 [INFO][4768] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.119322 containerd[1695]: 2026-01-14 01:04:37.951 [INFO][4797] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" HandleID="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:37.951 [INFO][4797] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" HandleID="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5010), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"coredns-66bc5c9577-c82bn", "timestamp":"2026-01-14 01:04:37.951834516 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:37.952 [INFO][4797] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4797] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:37.964 [INFO][4797] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:38.043 [INFO][4797] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:38.054 [INFO][4797] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:38.060 [INFO][4797] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:38.065 [INFO][4797] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.119587 containerd[1695]: 2026-01-14 01:04:38.070 [INFO][4797] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.070 [INFO][4797] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.072 [INFO][4797] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9 Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.078 [INFO][4797] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.092 [INFO][4797] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.4/26] block=192.168.31.0/26 handle="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.092 [INFO][4797] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.4/26] handle="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.092 [INFO][4797] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:04:38.120003 containerd[1695]: 2026-01-14 01:04:38.092 [INFO][4797] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.4/26] IPv6=[] ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" HandleID="k8s-pod-network.fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.120164 containerd[1695]: 2026-01-14 01:04:38.097 [INFO][4768] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7f235fe5-2cf3-4556-b562-f0359308c37e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 3, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"coredns-66bc5c9577-c82bn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0f5b3d82ccc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:38.120164 containerd[1695]: 2026-01-14 01:04:38.097 [INFO][4768] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.4/32] ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.120164 containerd[1695]: 2026-01-14 01:04:38.097 [INFO][4768] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f5b3d82ccc ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" 
WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.120164 containerd[1695]: 2026-01-14 01:04:38.103 [INFO][4768] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.120164 containerd[1695]: 2026-01-14 01:04:38.103 [INFO][4768] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7f235fe5-2cf3-4556-b562-f0359308c37e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 3, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9", Pod:"coredns-66bc5c9577-c82bn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0f5b3d82ccc", MAC:"36:27:db:f9:25:3e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:38.120342 containerd[1695]: 2026-01-14 01:04:38.117 [INFO][4768] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" Namespace="kube-system" Pod="coredns-66bc5c9577-c82bn" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--c82bn-eth0" Jan 14 01:04:38.135000 audit[4871]: NETFILTER_CFG table=filter:126 family=2 entries=50 op=nft_register_chain pid=4871 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 
01:04:38.135000 audit[4871]: SYSCALL arch=c000003e syscall=46 success=yes exit=24928 a0=3 a1=7ffc35d589c0 a2=0 a3=7ffc35d589ac items=0 ppid=4451 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.135000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:38.150713 containerd[1695]: time="2026-01-14T01:04:38.150661348Z" level=info msg="connecting to shim fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9" address="unix:///run/containerd/s/02aa72f53ee22cbf0368cc6ba28d28be307ddbd26bb4b87c600cf5b15f7a6234" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:38.170221 systemd[1]: Started cri-containerd-fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9.scope - libcontainer container fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9. Jan 14 01:04:38.177000 audit: BPF prog-id=225 op=LOAD Jan 14 01:04:38.177000 audit: BPF prog-id=226 op=LOAD Jan 14 01:04:38.177000 audit[4891]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.177000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=226 op=UNLOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=227 op=LOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=228 op=LOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=228 op=UNLOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=227 op=UNLOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.178000 audit: BPF prog-id=229 op=LOAD Jan 14 01:04:38.178000 audit[4891]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4880 pid=4891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323030306237316230306463343262653838656663646465653262 Jan 14 01:04:38.211058 containerd[1695]: time="2026-01-14T01:04:38.211019673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c82bn,Uid:7f235fe5-2cf3-4556-b562-f0359308c37e,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9\"" Jan 14 01:04:38.217368 containerd[1695]: time="2026-01-14T01:04:38.217290701Z" level=info msg="CreateContainer within sandbox \"fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:04:38.228268 systemd-networkd[1385]: calieb5108feda8: Gained IPv6LL Jan 14 01:04:38.243406 containerd[1695]: time="2026-01-14T01:04:38.243365322Z" level=info msg="Container 37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:04:38.251502 containerd[1695]: time="2026-01-14T01:04:38.251317230Z" level=info msg="CreateContainer within sandbox \"fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd\"" Jan 14 01:04:38.251999 containerd[1695]: time="2026-01-14T01:04:38.251981902Z" level=info msg="StartContainer for \"37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd\"" Jan 14 01:04:38.252746 containerd[1695]: time="2026-01-14T01:04:38.252720095Z" level=info msg="connecting to shim 37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd" address="unix:///run/containerd/s/02aa72f53ee22cbf0368cc6ba28d28be307ddbd26bb4b87c600cf5b15f7a6234" protocol=ttrpc version=3 Jan 14 01:04:38.268518 systemd[1]: Started cri-containerd-37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd.scope - libcontainer container 37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd. Jan 14 01:04:38.276000 audit: BPF prog-id=230 op=LOAD Jan 14 01:04:38.277000 audit: BPF prog-id=231 op=LOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=231 op=UNLOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=232 op=LOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=233 op=LOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=233 op=UNLOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=232 op=UNLOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.277000 audit: BPF prog-id=234 op=LOAD Jan 14 01:04:38.277000 audit[4917]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4880 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:38.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337656135626137663236303734313732663639343739333838643739 Jan 14 01:04:38.291189 containerd[1695]: time="2026-01-14T01:04:38.291154165Z" level=info msg="StartContainer for \"37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd\" returns successfully" Jan 14 01:04:38.437916 containerd[1695]: time="2026-01-14T01:04:38.437758453Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:38.440327 containerd[1695]: time="2026-01-14T01:04:38.440295804Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:04:38.440727 containerd[1695]: time="2026-01-14T01:04:38.440696421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:38.440980 kubelet[3316]: E0114 01:04:38.440918 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:04:38.440980 kubelet[3316]: 
E0114 01:04:38.440955 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:04:38.441245 kubelet[3316]: E0114 01:04:38.441157 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:38.442596 containerd[1695]: time="2026-01-14T01:04:38.442128824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:04:38.613597 systemd-networkd[1385]: cali4d8537906b1: Gained IPv6LL Jan 14 01:04:38.768205 containerd[1695]: time="2026-01-14T01:04:38.767870053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:38.769840 containerd[1695]: time="2026-01-14T01:04:38.769810318Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:04:38.769920 containerd[1695]: time="2026-01-14T01:04:38.769818954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:38.770094 kubelet[3316]: E0114 01:04:38.770069 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:04:38.770156 kubelet[3316]: E0114 01:04:38.770105 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:04:38.770187 kubelet[3316]: E0114 01:04:38.770164 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:38.770210 kubelet[3316]: E0114 01:04:38.770196 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:39.015385 kubelet[3316]: E0114 01:04:39.015069 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:39.017194 kubelet[3316]: E0114 01:04:39.017135 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:04:39.069631 kubelet[3316]: I0114 01:04:39.069580 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-c82bn" podStartSLOduration=40.069565651 podStartE2EDuration="40.069565651s" podCreationTimestamp="2026-01-14 01:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:04:39.044322022 +0000 UTC m=+45.295985329" watchObservedRunningTime="2026-01-14 01:04:39.069565651 +0000 UTC m=+45.321228959" Jan 14 01:04:39.116000 audit[4949]: NETFILTER_CFG table=filter:127 family=2 entries=17 op=nft_register_rule pid=4949 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:39.116000 audit[4949]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd7439b8d0 a2=0 a3=7ffd7439b8bc items=0 ppid=3467 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:39.116000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:39.121000 audit[4949]: NETFILTER_CFG table=nat:128 family=2 entries=35 op=nft_register_chain pid=4949 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:39.121000 audit[4949]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffd7439b8d0 a2=0 a3=7ffd7439b8bc items=0 ppid=3467 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:39.121000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:39.380598 systemd-networkd[1385]: cali0f5b3d82ccc: Gained IPv6LL Jan 14 01:04:39.380834 systemd-networkd[1385]: calida71fc46531: Gained IPv6LL Jan 14 01:04:39.860506 containerd[1695]: time="2026-01-14T01:04:39.860473717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-vc7sq,Uid:95ab78ae-ca97-4cab-9490-03b0a50f740c,Namespace:calico-system,Attempt:0,}" Jan 14 01:04:39.863345 containerd[1695]: time="2026-01-14T01:04:39.863319593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-d9j8s,Uid:6dd4e5be-52fe-4768-ad9c-ffbe85d93102,Namespace:kube-system,Attempt:0,}" Jan 14 01:04:39.967524 systemd-networkd[1385]: calia83c40726fe: Link UP Jan 14 01:04:39.967654 systemd-networkd[1385]: calia83c40726fe: Gained carrier Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.910 [INFO][4961] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0 coredns-66bc5c9577- kube-system 6dd4e5be-52fe-4768-ad9c-ffbe85d93102 810 0 2026-01-14 01:03:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 coredns-66bc5c9577-d9j8s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia83c40726fe [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.910 [INFO][4961] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.934 [INFO][4978] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" HandleID="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.935 [INFO][4978] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" HandleID="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d50f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"coredns-66bc5c9577-d9j8s", "timestamp":"2026-01-14 01:04:39.93489663 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.935 [INFO][4978] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.935 [INFO][4978] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.935 [INFO][4978] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.941 [INFO][4978] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.944 [INFO][4978] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.948 [INFO][4978] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.949 [INFO][4978] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.951 [INFO][4978] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.951 [INFO][4978] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.952 [INFO][4978] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718 Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.957 [INFO][4978] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.961 [INFO][4978] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.5/26] block=192.168.31.0/26 handle="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.961 [INFO][4978] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.5/26] handle="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.962 [INFO][4978] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
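The trace above is Calico IPAM assigning the coredns pod its address: the plugin takes the host-wide IPAM lock, looks for a block already affine to this node, confirms the affinity for 192.168.31.0/26, and claims 192.168.31.5 from it. Calico carves its pools into small blocks (/26 by default, i.e. 64 addresses) and pins each block to a host, which is why the later assignments on this node (.6 for goldmane, .7 for the apiserver pod) come out of the same block. A minimal Go sketch of that containment check, using the block and address from the trace:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block and address taken from the IPAM trace above.
	block := netip.MustParsePrefix("192.168.31.0/26")
	addr := netip.MustParseAddr("192.168.31.5")

	// A /26 block holds 2^(32-26) = 64 addresses, all reserved for this
	// node while the block affinity exists.
	size := 1 << (32 - block.Bits())

	fmt.Printf("block %s holds %d addresses; contains %s: %v\n",
		block, size, addr, block.Contains(addr))
}
```

Allocating per-node blocks rather than individual addresses lets each node advertise one aggregated route per block instead of one route per pod.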
Jan 14 01:04:39.986044 containerd[1695]: 2026-01-14 01:04:39.962 [INFO][4978] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.5/26] IPv6=[] ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" HandleID="k8s-pod-network.26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Workload="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986589 containerd[1695]: 2026-01-14 01:04:39.963 [INFO][4961] cni-plugin/k8s.go 418: Populated endpoint ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"6dd4e5be-52fe-4768-ad9c-ffbe85d93102", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 3, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"coredns-66bc5c9577-d9j8s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia83c40726fe", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:39.986589 containerd[1695]: 2026-01-14 01:04:39.963 [INFO][4961] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.5/32] ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986589 containerd[1695]: 2026-01-14 01:04:39.963 [INFO][4961] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia83c40726fe ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" 
WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986589 containerd[1695]: 2026-01-14 01:04:39.967 [INFO][4961] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.986589 containerd[1695]: 2026-01-14 01:04:39.968 [INFO][4961] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"6dd4e5be-52fe-4768-ad9c-ffbe85d93102", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 3, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718", Pod:"coredns-66bc5c9577-d9j8s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia83c40726fe", MAC:"62:4e:ec:27:f3:7a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:39.986759 containerd[1695]: 2026-01-14 01:04:39.983 [INFO][4961] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" Namespace="kube-system" Pod="coredns-66bc5c9577-d9j8s" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-coredns--66bc5c9577--d9j8s-eth0" Jan 14 01:04:39.997000 audit[4998]: NETFILTER_CFG table=filter:129 family=2 entries=44 op=nft_register_chain pid=4998 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 
01:04:39.997000 audit[4998]: SYSCALL arch=c000003e syscall=46 success=yes exit=21532 a0=3 a1=7fff5834f030 a2=0 a3=7fff5834f01c items=0 ppid=4451 pid=4998 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:39.997000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:40.015814 containerd[1695]: time="2026-01-14T01:04:40.015776716Z" level=info msg="connecting to shim 26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718" address="unix:///run/containerd/s/5ffb08a4b0fd0b8e7fc2c3338cad1b6f99672b560cb8b3e05127e89a94313b40" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:40.020225 kubelet[3316]: E0114 01:04:40.020180 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:40.056349 systemd[1]: Started cri-containerd-26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718.scope - libcontainer container 26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718. 
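The audit records surrounding each container start here are runc at work: every "BPF prog-id=... op=LOAD" is paired with a bpf(2) SYSCALL record (syscall 321 on x86_64) and every op=UNLOAD with a close(2) (syscall 3) on the program's file descriptor, most likely from runc's cgroup setup. The PROCTITLE values are the hex-encoded argv of the runc invocation, with NUL bytes separating the arguments; the strings recorded in the log are truncated, so the sketch below decodes only the leading portion of one of them:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	// Leading portion of one audit PROCTITLE value from the log above; the
	// kernel hex-encodes argv with NUL separators between arguments.
	const proctitle = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"

	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	// Replace the NUL separators with spaces to recover the command line
	// prefix: "runc --root /run/containerd/runc/k8s.io".
	fmt.Println(strings.ReplaceAll(string(raw), "\x00", " "))
}
```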
Jan 14 01:04:40.073000 audit: BPF prog-id=235 op=LOAD Jan 14 01:04:40.074000 audit: BPF prog-id=236 op=LOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=236 op=UNLOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=237 op=LOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=238 op=LOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=238 op=UNLOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=237 op=UNLOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.074000 audit: BPF prog-id=239 op=LOAD Jan 14 01:04:40.074000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236613963316262356662306363663133366532373865363831376563 Jan 14 01:04:40.085408 systemd-networkd[1385]: cali4679894d7d6: Link UP Jan 14 01:04:40.088937 systemd-networkd[1385]: cali4679894d7d6: Gained carrier Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.906 [INFO][4950] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0 goldmane-7c778bb748- calico-system 95ab78ae-ca97-4cab-9490-03b0a50f740c 809 0 2026-01-14 01:04:10 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 goldmane-7c778bb748-vc7sq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4679894d7d6 [] [] }} ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.907 [INFO][4950] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.944 [INFO][4976] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" HandleID="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Workload="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.944 [INFO][4976] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" HandleID="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Workload="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002d5090), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"goldmane-7c778bb748-vc7sq", "timestamp":"2026-01-14 01:04:39.944558476 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.944 [INFO][4976] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.962 [INFO][4976] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:39.962 [INFO][4976] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.041 [INFO][4976] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.048 [INFO][4976] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.056 [INFO][4976] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.060 [INFO][4976] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.064 [INFO][4976] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.064 [INFO][4976] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.066 [INFO][4976] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.070 [INFO][4976] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.078 [INFO][4976] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.6/26] block=192.168.31.0/26 handle="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.079 [INFO][4976] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.6/26] handle="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.079 [INFO][4976] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
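The coredns WorkloadEndpoint dumps earlier in this log print the named container ports twice: in decimal in the endpoint header ({dns UDP 53 0 }, {metrics TCP 9153 0 }, ...) and as Go hex literals in the Ports slice (Port:0x35, Port:0x23c1, Port:0x1f90, Port:0x1ff5). They are the same values; a short sketch just to make the conversion explicit:

```go
package main

import "fmt"

func main() {
	// Hex port values as they appear in the v3.WorkloadEndpointPort dumps above.
	ports := map[string]uint16{
		"dns / dns-tcp":   0x35,   // 53
		"metrics":         0x23c1, // 9153
		"liveness-probe":  0x1f90, // 8080
		"readiness-probe": 0x1ff5, // 8181
	}
	for name, p := range ports {
		fmt.Printf("%-15s %d\n", name, p)
	}
}
```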
Jan 14 01:04:40.111098 containerd[1695]: 2026-01-14 01:04:40.079 [INFO][4976] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.6/26] IPv6=[] ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" HandleID="k8s-pod-network.9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Workload="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.081 [INFO][4950] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"95ab78ae-ca97-4cab-9490-03b0a50f740c", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"goldmane-7c778bb748-vc7sq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4679894d7d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.081 [INFO][4950] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.6/32] ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.081 [INFO][4950] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4679894d7d6 ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.088 [INFO][4950] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.089 [INFO][4950] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" 
Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"95ab78ae-ca97-4cab-9490-03b0a50f740c", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e", Pod:"goldmane-7c778bb748-vc7sq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4679894d7d6", MAC:"ae:7e:97:ee:9e:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:40.111589 containerd[1695]: 2026-01-14 01:04:40.105 [INFO][4950] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" Namespace="calico-system" Pod="goldmane-7c778bb748-vc7sq" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-goldmane--7c778bb748--vc7sq-eth0" Jan 14 01:04:40.127672 containerd[1695]: time="2026-01-14T01:04:40.127628096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-d9j8s,Uid:6dd4e5be-52fe-4768-ad9c-ffbe85d93102,Namespace:kube-system,Attempt:0,} returns sandbox id \"26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718\"" Jan 14 01:04:40.130000 audit[5054]: NETFILTER_CFG table=filter:130 family=2 entries=66 op=nft_register_chain pid=5054 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:40.130000 audit[5054]: SYSCALL arch=c000003e syscall=46 success=yes exit=32784 a0=3 a1=7fffed69a940 a2=0 a3=7fffed69a92c items=0 ppid=4451 pid=5054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.130000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:40.133760 containerd[1695]: time="2026-01-14T01:04:40.133736186Z" level=info msg="CreateContainer within sandbox \"26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:04:40.153286 containerd[1695]: time="2026-01-14T01:04:40.153109979Z" level=info msg="Container 20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074: CDI devices from CRI Config.CDIDevices: []" Jan 14 
01:04:40.154172 containerd[1695]: time="2026-01-14T01:04:40.154147102Z" level=info msg="connecting to shim 9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e" address="unix:///run/containerd/s/baeecd9d069be7bbf01640f82a0fb6ce55141335db5d810d5341f4d5b342e627" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:40.168419 containerd[1695]: time="2026-01-14T01:04:40.167768530Z" level=info msg="CreateContainer within sandbox \"26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074\"" Jan 14 01:04:40.170428 containerd[1695]: time="2026-01-14T01:04:40.170408556Z" level=info msg="StartContainer for \"20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074\"" Jan 14 01:04:40.172933 containerd[1695]: time="2026-01-14T01:04:40.172899724Z" level=info msg="connecting to shim 20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074" address="unix:///run/containerd/s/5ffb08a4b0fd0b8e7fc2c3338cad1b6f99672b560cb8b3e05127e89a94313b40" protocol=ttrpc version=3 Jan 14 01:04:40.189353 systemd[1]: Started cri-containerd-9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e.scope - libcontainer container 9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e. Jan 14 01:04:40.196168 systemd[1]: Started cri-containerd-20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074.scope - libcontainer container 20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074. Jan 14 01:04:40.206000 audit: BPF prog-id=240 op=LOAD Jan 14 01:04:40.207000 audit: BPF prog-id=241 op=LOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=241 op=UNLOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=242 op=LOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=243 op=LOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=243 op=UNLOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=242 op=UNLOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.207000 audit: BPF prog-id=244 op=LOAD Jan 14 01:04:40.207000 audit[5076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=5065 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965666364646562663262666662313566623437366233313932626664 Jan 14 01:04:40.209000 audit: BPF prog-id=245 op=LOAD Jan 14 01:04:40.209000 audit: BPF prog-id=246 op=LOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=246 op=UNLOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=247 op=LOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=248 op=LOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=248 op=UNLOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=247 op=UNLOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.209000 audit: BPF prog-id=249 op=LOAD Jan 14 01:04:40.209000 audit[5087]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5008 pid=5087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:40.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230623164663730656665383463366635336431383134383830313736 Jan 14 01:04:40.234784 containerd[1695]: time="2026-01-14T01:04:40.234733786Z" level=info msg="StartContainer for \"20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074\" returns successfully" Jan 14 01:04:40.262592 containerd[1695]: time="2026-01-14T01:04:40.262501761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-vc7sq,Uid:95ab78ae-ca97-4cab-9490-03b0a50f740c,Namespace:calico-system,Attempt:0,} returns sandbox id \"9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e\"" Jan 14 01:04:40.264724 containerd[1695]: time="2026-01-14T01:04:40.264679326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:04:40.705034 containerd[1695]: time="2026-01-14T01:04:40.704836283Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:40.707181 containerd[1695]: time="2026-01-14T01:04:40.707086275Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:04:40.707181 containerd[1695]: time="2026-01-14T01:04:40.707156274Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:40.707334 kubelet[3316]: E0114 01:04:40.707299 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:04:40.707384 kubelet[3316]: E0114 01:04:40.707338 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:04:40.707414 kubelet[3316]: E0114 01:04:40.707393 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:40.707449 kubelet[3316]: E0114 
01:04:40.707419 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:04:40.857583 containerd[1695]: time="2026-01-14T01:04:40.857471439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-tk96l,Uid:f26be41d-3305-4b21-9d76-bde121cc2cce,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:04:40.859621 containerd[1695]: time="2026-01-14T01:04:40.859574727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-42t9l,Uid:b7d50268-8797-458d-a912-f7456846c1f2,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:04:40.879812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3800606738.mount: Deactivated successfully. Jan 14 01:04:40.989157 systemd-networkd[1385]: cali96388fb6566: Link UP Jan 14 01:04:40.990410 systemd-networkd[1385]: cali96388fb6566: Gained carrier Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.908 [INFO][5132] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0 calico-apiserver-694bbc95d6- calico-apiserver f26be41d-3305-4b21-9d76-bde121cc2cce 813 0 2026-01-14 01:04:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:694bbc95d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 calico-apiserver-694bbc95d6-tk96l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali96388fb6566 [] [] }} ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.909 [INFO][5132] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.942 [INFO][5157] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" HandleID="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.942 [INFO][5157] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" HandleID="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000497460), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"calico-apiserver-694bbc95d6-tk96l", "timestamp":"2026-01-14 01:04:40.942243232 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.942 [INFO][5157] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.942 [INFO][5157] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.942 [INFO][5157] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.951 [INFO][5157] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.957 [INFO][5157] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.960 [INFO][5157] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.962 [INFO][5157] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.963 [INFO][5157] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.963 [INFO][5157] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.964 [INFO][5157] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4 Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.969 [INFO][5157] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.975 [INFO][5157] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.7/26] block=192.168.31.0/26 handle="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.975 [INFO][5157] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.7/26] handle="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.976 [INFO][5157] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
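The audit PROCTITLE fields in the runc records above carry the process command line hex-encoded, with NUL bytes separating the arguments. A minimal Python sketch that decodes one of those values back into a readable runc invocation (the hex literal is copied verbatim from the records, so the trailing container ID is truncated exactly as audit logged it; nothing is reconstructed):

    # Decode an auditd PROCTITLE value: hex-encoded argv, arguments separated by NUL bytes.
    # The literal below is the value from the runc records above, truncated as logged.
    PROCTITLE_HEX = (
        "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
        "2F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F"
        "2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E"
        "696F2F3230623164663730656665383463366635336431383134383830313736"
    )

    def decode_proctitle(hex_value: str) -> str:
        """Turn the hex PROCTITLE field into a space-separated command line."""
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", errors="replace")

    print(decode_proctitle(PROCTITLE_HEX))
    # runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/20b1df70efe84c6f53d1814880176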
Jan 14 01:04:41.003910 containerd[1695]: 2026-01-14 01:04:40.976 [INFO][5157] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.7/26] IPv6=[] ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" HandleID="k8s-pod-network.4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:40.978 [INFO][5132] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0", GenerateName:"calico-apiserver-694bbc95d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f26be41d-3305-4b21-9d76-bde121cc2cce", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"694bbc95d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"calico-apiserver-694bbc95d6-tk96l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali96388fb6566", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:40.978 [INFO][5132] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.7/32] ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:40.978 [INFO][5132] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96388fb6566 ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:40.989 [INFO][5132] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:40.990 
[INFO][5132] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0", GenerateName:"calico-apiserver-694bbc95d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f26be41d-3305-4b21-9d76-bde121cc2cce", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"694bbc95d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4", Pod:"calico-apiserver-694bbc95d6-tk96l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali96388fb6566", MAC:"06:cc:d0:1e:39:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:41.004845 containerd[1695]: 2026-01-14 01:04:41.001 [INFO][5132] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-tk96l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--tk96l-eth0" Jan 14 01:04:41.016000 audit[5183]: NETFILTER_CFG table=filter:131 family=2 entries=66 op=nft_register_chain pid=5183 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:41.019140 kernel: kauditd_printk_skb: 394 callbacks suppressed Jan 14 01:04:41.019212 kernel: audit: type=1325 audit(1768352681.016:731): table=filter:131 family=2 entries=66 op=nft_register_chain pid=5183 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:41.016000 audit[5183]: SYSCALL arch=c000003e syscall=46 success=yes exit=32944 a0=3 a1=7fff41c79a30 a2=0 a3=7fff41c79a1c items=0 ppid=4451 pid=5183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.022153 kernel: audit: type=1300 audit(1768352681.016:731): arch=c000003e syscall=46 success=yes exit=32944 a0=3 a1=7fff41c79a30 a2=0 a3=7fff41c79a1c items=0 ppid=4451 pid=5183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:04:41.034182 kubelet[3316]: E0114 01:04:41.033907 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:04:41.016000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:41.040147 kernel: audit: type=1327 audit(1768352681.016:731): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:41.044556 containerd[1695]: time="2026-01-14T01:04:41.044466301Z" level=info msg="connecting to shim 4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4" address="unix:///run/containerd/s/81eda2e01d5f8ab539abcc1a4df6f64a2eeae0007f6833b63d27a96826331504" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:41.072373 kubelet[3316]: I0114 01:04:41.072328 3316 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-d9j8s" podStartSLOduration=42.072312127000004 podStartE2EDuration="42.072312127s" podCreationTimestamp="2026-01-14 01:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:04:41.071454933 +0000 UTC m=+47.323118240" watchObservedRunningTime="2026-01-14 01:04:41.072312127 +0000 UTC m=+47.323975435" Jan 14 01:04:41.077000 audit[5213]: NETFILTER_CFG table=filter:132 family=2 entries=14 op=nft_register_rule pid=5213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:41.083088 kernel: audit: type=1325 audit(1768352681.077:732): table=filter:132 family=2 entries=14 op=nft_register_rule pid=5213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:41.083179 kernel: audit: type=1300 audit(1768352681.077:732): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff2f51d9d0 a2=0 a3=7fff2f51d9bc items=0 ppid=3467 pid=5213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.077000 audit[5213]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff2f51d9d0 a2=0 a3=7fff2f51d9bc items=0 ppid=3467 pid=5213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.077000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:41.095072 kernel: audit: type=1327 audit(1768352681.077:732): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:41.096449 systemd[1]: Started cri-containerd-4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4.scope - libcontainer container 4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4. 
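The pod_startup_latency_tracker record above reports podStartSLOduration=42.072312127 for coredns-66bc5c9577-d9j8s; firstStartedPulling and lastFinishedPulling are zero values in that record. The figure is consistent with the gap between the pod's creation timestamp (01:03:59) and the watch-observed running time (01:04:41.072312127). A quick check with Python's datetime, reproducing the arithmetic from the values in the record rather than kubelet's own code:

    # Reproduce the logged startup duration from the timestamps in the record above.
    from datetime import datetime, timezone

    created  = datetime(2026, 1, 14, 1, 3, 59, tzinfo=timezone.utc)         # podCreationTimestamp
    observed = datetime(2026, 1, 14, 1, 4, 41, 72312, tzinfo=timezone.utc)  # 01:04:41.072312127, microsecond precision

    print((observed - created).total_seconds())  # 42.072312, matching podStartSLOduration=42.072312127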
Jan 14 01:04:41.099000 audit[5213]: NETFILTER_CFG table=nat:133 family=2 entries=20 op=nft_register_rule pid=5213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:41.104073 kernel: audit: type=1325 audit(1768352681.099:733): table=nat:133 family=2 entries=20 op=nft_register_rule pid=5213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:41.099000 audit[5213]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff2f51d9d0 a2=0 a3=7fff2f51d9bc items=0 ppid=3467 pid=5213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.109079 kernel: audit: type=1300 audit(1768352681.099:733): arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff2f51d9d0 a2=0 a3=7fff2f51d9bc items=0 ppid=3467 pid=5213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.099000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:41.112249 kernel: audit: type=1327 audit(1768352681.099:733): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:41.123000 audit: BPF prog-id=250 op=LOAD Jan 14 01:04:41.126132 kernel: audit: type=1334 audit(1768352681.123:734): prog-id=250 op=LOAD Jan 14 01:04:41.125000 audit: BPF prog-id=251 op=LOAD Jan 14 01:04:41.125000 audit[5202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.125000 audit: BPF prog-id=251 op=UNLOAD Jan 14 01:04:41.125000 audit[5202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.125000 audit: BPF prog-id=252 op=LOAD Jan 14 01:04:41.125000 audit[5202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.125000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.125000 audit: BPF prog-id=253 op=LOAD Jan 14 01:04:41.125000 audit[5202]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.126000 audit: BPF prog-id=253 op=UNLOAD Jan 14 01:04:41.126000 audit[5202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.126000 audit: BPF prog-id=252 op=UNLOAD Jan 14 01:04:41.126000 audit[5202]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.126000 audit: BPF prog-id=254 op=LOAD Jan 14 01:04:41.126000 audit[5202]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5192 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3461626461643765313865326230313930633130323962303636373435 Jan 14 01:04:41.137997 systemd-networkd[1385]: cali84a2c086eb0: Link UP Jan 14 01:04:41.139037 systemd-networkd[1385]: cali84a2c086eb0: Gained carrier Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.922 [INFO][5142] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0 calico-apiserver-694bbc95d6- calico-apiserver b7d50268-8797-458d-a912-f7456846c1f2 808 0 2026-01-14 01:04:09 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:694bbc95d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-de0c74fc75 calico-apiserver-694bbc95d6-42t9l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali84a2c086eb0 [] [] }} ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.923 [INFO][5142] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.961 [INFO][5165] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" HandleID="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.961 [INFO][5165] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" HandleID="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f030), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-de0c74fc75", "pod":"calico-apiserver-694bbc95d6-42t9l", "timestamp":"2026-01-14 01:04:40.961457638 +0000 UTC"}, Hostname:"ci-4547-0-0-n-de0c74fc75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.961 [INFO][5165] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.976 [INFO][5165] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:40.976 [INFO][5165] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-de0c74fc75' Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.052 [INFO][5165] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.070 [INFO][5165] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.086 [INFO][5165] ipam/ipam.go 511: Trying affinity for 192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.095 [INFO][5165] ipam/ipam.go 158: Attempting to load block cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.100 [INFO][5165] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.31.0/26 host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.100 [INFO][5165] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.31.0/26 handle="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.104 [INFO][5165] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3 Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.112 [INFO][5165] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.31.0/26 handle="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.130 [INFO][5165] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.31.8/26] block=192.168.31.0/26 handle="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.131 [INFO][5165] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.31.8/26] handle="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" host="ci-4547-0-0-n-de0c74fc75" Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.131 [INFO][5165] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
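Both calico-apiserver pods draw their addresses from the same node-affine block, 192.168.31.0/26, and the IPAM plugin claims 192.168.31.7 and then 192.168.31.8 while holding the host-wide lock. A standard-library Python sketch showing what that /26 spans, using only addresses taken from the records above:

    # The node ci-4547-0-0-n-de0c74fc75 holds an affinity for the block below;
    # both IPs claimed in the IPAM records must fall inside it.
    import ipaddress

    block = ipaddress.ip_network("192.168.31.0/26")
    print(block.num_addresses, block[0], block[-1])   # 64 192.168.31.0 192.168.31.63

    for claimed in ("192.168.31.7", "192.168.31.8"):
        assert ipaddress.ip_address(claimed) in block
    # Each pod is then published to its cali* veth as a /32 (IPNetworks: 192.168.31.7/32, 192.168.31.8/32).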
Jan 14 01:04:41.155802 containerd[1695]: 2026-01-14 01:04:41.131 [INFO][5165] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.31.8/26] IPv6=[] ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" HandleID="k8s-pod-network.e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Workload="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.132 [INFO][5142] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0", GenerateName:"calico-apiserver-694bbc95d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7d50268-8797-458d-a912-f7456846c1f2", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"694bbc95d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"", Pod:"calico-apiserver-694bbc95d6-42t9l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali84a2c086eb0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.133 [INFO][5142] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.8/32] ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.133 [INFO][5142] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali84a2c086eb0 ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.138 [INFO][5142] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.139 
[INFO][5142] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0", GenerateName:"calico-apiserver-694bbc95d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7d50268-8797-458d-a912-f7456846c1f2", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 4, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"694bbc95d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-de0c74fc75", ContainerID:"e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3", Pod:"calico-apiserver-694bbc95d6-42t9l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali84a2c086eb0", MAC:"46:fe:9b:f1:b7:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:04:41.156321 containerd[1695]: 2026-01-14 01:04:41.153 [INFO][5142] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" Namespace="calico-apiserver" Pod="calico-apiserver-694bbc95d6-42t9l" WorkloadEndpoint="ci--4547--0--0--n--de0c74fc75-k8s-calico--apiserver--694bbc95d6--42t9l-eth0" Jan 14 01:04:41.193000 audit[5238]: NETFILTER_CFG table=filter:134 family=2 entries=57 op=nft_register_chain pid=5238 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:04:41.193000 audit[5238]: SYSCALL arch=c000003e syscall=46 success=yes exit=27812 a0=3 a1=7ffc95b221d0 a2=0 a3=7ffc95b221bc items=0 ppid=4451 pid=5238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.193000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:04:41.195496 containerd[1695]: time="2026-01-14T01:04:41.195414923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-tk96l,Uid:f26be41d-3305-4b21-9d76-bde121cc2cce,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4\"" Jan 14 01:04:41.198195 containerd[1695]: time="2026-01-14T01:04:41.198177961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" 
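Every image pull in this run fails the same way: containerd logs a 404 from ghcr.io, then an error-level PullImage line, and kubelet surfaces it as ErrImagePull. A small Python sketch that pulls the failing image references out of such containerd lines (the sample is one of the records above; the regex is an illustration for reading this log, not containerd tooling):

    # Extract image references from containerd 'PullImage ... failed' error lines.
    import re

    PULL_FAIL = re.compile(r'level=error msg="PullImage \\"(?P<image>[^"\\]+)\\" failed"')

    sample = ('time="2026-01-14T01:04:40.707086275Z" level=error '
              'msg="PullImage \\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\" failed" '
              'error="rpc error: code = NotFound ..."')

    failed = {m.group("image") for m in PULL_FAIL.finditer(sample)}
    print(sorted(failed))   # ['ghcr.io/flatcar/calico/goldmane:v3.30.4']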
Jan 14 01:04:41.199380 containerd[1695]: time="2026-01-14T01:04:41.199359509Z" level=info msg="connecting to shim e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3" address="unix:///run/containerd/s/6dfb2d21238d0a34dc6f97f98a7a17672b93d821ab3d7cf95fb55616f139631f" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:04:41.228209 systemd[1]: Started cri-containerd-e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3.scope - libcontainer container e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3. Jan 14 01:04:41.236144 systemd-networkd[1385]: calia83c40726fe: Gained IPv6LL Jan 14 01:04:41.238000 audit: BPF prog-id=255 op=LOAD Jan 14 01:04:41.241000 audit: BPF prog-id=256 op=LOAD Jan 14 01:04:41.241000 audit[5259]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.241000 audit: BPF prog-id=256 op=UNLOAD Jan 14 01:04:41.241000 audit[5259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.242000 audit: BPF prog-id=257 op=LOAD Jan 14 01:04:41.242000 audit[5259]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.242000 audit: BPF prog-id=258 op=LOAD Jan 14 01:04:41.242000 audit[5259]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.243000 audit: BPF prog-id=258 op=UNLOAD Jan 14 01:04:41.243000 audit[5259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 
a2=0 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.243000 audit: BPF prog-id=257 op=UNLOAD Jan 14 01:04:41.243000 audit[5259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.244000 audit: BPF prog-id=259 op=LOAD Jan 14 01:04:41.244000 audit[5259]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=5248 pid=5259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:41.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532633233353563323061623566636430363637333234376463343330 Jan 14 01:04:41.288925 containerd[1695]: time="2026-01-14T01:04:41.288898390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-694bbc95d6-42t9l,Uid:b7d50268-8797-458d-a912-f7456846c1f2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3\"" Jan 14 01:04:41.512841 containerd[1695]: time="2026-01-14T01:04:41.512583515Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:41.514132 containerd[1695]: time="2026-01-14T01:04:41.514102629Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:04:41.517061 containerd[1695]: time="2026-01-14T01:04:41.515077687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:41.517116 kubelet[3316]: E0114 01:04:41.515189 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:41.517116 kubelet[3316]: E0114 01:04:41.515225 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:41.517116 kubelet[3316]: E0114 01:04:41.515417 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:41.517116 kubelet[3316]: E0114 01:04:41.515450 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:04:41.517532 containerd[1695]: time="2026-01-14T01:04:41.517369743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:04:41.852401 containerd[1695]: time="2026-01-14T01:04:41.852342813Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:41.853929 containerd[1695]: time="2026-01-14T01:04:41.853876771Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:04:41.853929 containerd[1695]: time="2026-01-14T01:04:41.853901641Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:41.854214 kubelet[3316]: E0114 01:04:41.854107 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:41.854214 kubelet[3316]: E0114 01:04:41.854143 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:41.855130 kubelet[3316]: E0114 01:04:41.854214 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:41.855130 kubelet[3316]: E0114 01:04:41.854241 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:04:41.940236 systemd-networkd[1385]: cali4679894d7d6: Gained IPv6LL Jan 14 01:04:42.046320 kubelet[3316]: E0114 01:04:42.046197 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:04:42.048614 kubelet[3316]: E0114 01:04:42.048118 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:04:42.050636 kubelet[3316]: E0114 01:04:42.050614 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:04:42.118000 audit[5289]: NETFILTER_CFG table=filter:135 family=2 entries=14 op=nft_register_rule pid=5289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:42.118000 audit[5289]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd8f18a610 a2=0 a3=7ffd8f18a5fc items=0 ppid=3467 pid=5289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:42.118000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:42.139000 audit[5289]: NETFILTER_CFG table=nat:136 family=2 entries=56 op=nft_register_chain pid=5289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:04:42.139000 audit[5289]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffd8f18a610 a2=0 a3=7ffd8f18a5fc items=0 ppid=3467 pid=5289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:04:42.139000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:04:42.443390 kubelet[3316]: I0114 01:04:42.442435 3316 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:04:42.452228 systemd-networkd[1385]: cali96388fb6566: Gained IPv6LL Jan 14 01:04:42.900197 systemd-networkd[1385]: cali84a2c086eb0: Gained IPv6LL 
Jan 14 01:04:43.052995 kubelet[3316]: E0114 01:04:43.052860 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:04:43.053736 kubelet[3316]: E0114 01:04:43.053587 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:04:49.860269 containerd[1695]: time="2026-01-14T01:04:49.860206567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:04:50.186703 containerd[1695]: time="2026-01-14T01:04:50.186495317Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:50.188118 containerd[1695]: time="2026-01-14T01:04:50.188031274Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:04:50.188118 containerd[1695]: time="2026-01-14T01:04:50.188077552Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:50.188271 kubelet[3316]: E0114 01:04:50.188235 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:04:50.188791 kubelet[3316]: E0114 01:04:50.188282 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:04:50.188791 kubelet[3316]: E0114 01:04:50.188474 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:50.189012 containerd[1695]: time="2026-01-14T01:04:50.188696472Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:04:50.517570 containerd[1695]: time="2026-01-14T01:04:50.517443296Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:50.519970 containerd[1695]: time="2026-01-14T01:04:50.519930385Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:04:50.520069 containerd[1695]: time="2026-01-14T01:04:50.520004150Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:50.520249 kubelet[3316]: E0114 01:04:50.520215 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:04:50.520319 kubelet[3316]: E0114 01:04:50.520310 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:04:50.520574 kubelet[3316]: E0114 01:04:50.520509 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:50.520574 kubelet[3316]: E0114 01:04:50.520546 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:04:50.521647 containerd[1695]: time="2026-01-14T01:04:50.521627367Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:04:50.842406 containerd[1695]: time="2026-01-14T01:04:50.842242215Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:50.845469 containerd[1695]: time="2026-01-14T01:04:50.845364538Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:04:50.845469 containerd[1695]: time="2026-01-14T01:04:50.845443061Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:50.845609 kubelet[3316]: E0114 01:04:50.845577 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:04:50.845649 
kubelet[3316]: E0114 01:04:50.845619 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:04:50.845748 kubelet[3316]: E0114 01:04:50.845723 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:50.845876 kubelet[3316]: E0114 01:04:50.845763 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:04:52.856143 containerd[1695]: time="2026-01-14T01:04:52.856037479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:04:53.198639 containerd[1695]: time="2026-01-14T01:04:53.198308551Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:53.200139 containerd[1695]: time="2026-01-14T01:04:53.200037628Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:04:53.200139 containerd[1695]: time="2026-01-14T01:04:53.200123962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:53.200501 kubelet[3316]: E0114 01:04:53.200450 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:04:53.200501 kubelet[3316]: E0114 01:04:53.200487 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:04:53.201325 kubelet[3316]: E0114 01:04:53.200846 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:53.202068 
containerd[1695]: time="2026-01-14T01:04:53.201985903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:04:53.525836 containerd[1695]: time="2026-01-14T01:04:53.525394289Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:53.528035 containerd[1695]: time="2026-01-14T01:04:53.527904692Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:04:53.528035 containerd[1695]: time="2026-01-14T01:04:53.527983364Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:53.528186 kubelet[3316]: E0114 01:04:53.528153 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:04:53.528223 kubelet[3316]: E0114 01:04:53.528191 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:04:53.528271 kubelet[3316]: E0114 01:04:53.528251 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:53.528310 kubelet[3316]: E0114 01:04:53.528289 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:04:53.858916 containerd[1695]: time="2026-01-14T01:04:53.858462056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:04:54.200174 containerd[1695]: time="2026-01-14T01:04:54.199870726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:54.201458 containerd[1695]: time="2026-01-14T01:04:54.201414526Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:04:54.201741 containerd[1695]: time="2026-01-14T01:04:54.201494013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:54.201778 kubelet[3316]: E0114 01:04:54.201676 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:04:54.201778 kubelet[3316]: E0114 01:04:54.201712 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:04:54.202488 kubelet[3316]: E0114 01:04:54.202118 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:54.202488 kubelet[3316]: E0114 01:04:54.202153 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:04:56.856357 containerd[1695]: time="2026-01-14T01:04:56.856160787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:04:57.183517 containerd[1695]: time="2026-01-14T01:04:57.183203604Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:57.185230 containerd[1695]: time="2026-01-14T01:04:57.185159113Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:04:57.185314 containerd[1695]: time="2026-01-14T01:04:57.185248805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:57.185473 kubelet[3316]: E0114 01:04:57.185436 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:57.185800 kubelet[3316]: E0114 01:04:57.185480 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:57.185800 kubelet[3316]: E0114 01:04:57.185556 3316 kuberuntime_manager.go:1449] "Unhandled 
Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:57.185800 kubelet[3316]: E0114 01:04:57.185586 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:04:57.857224 containerd[1695]: time="2026-01-14T01:04:57.857090379Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:04:58.180482 containerd[1695]: time="2026-01-14T01:04:58.180277558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:04:58.181832 containerd[1695]: time="2026-01-14T01:04:58.181758002Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:04:58.181897 containerd[1695]: time="2026-01-14T01:04:58.181820526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:04:58.182122 kubelet[3316]: E0114 01:04:58.182089 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:58.182771 kubelet[3316]: E0114 01:04:58.182131 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:04:58.182771 kubelet[3316]: E0114 01:04:58.182200 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:04:58.182771 kubelet[3316]: E0114 01:04:58.182225 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:05:01.052511 update_engine[1668]: I20260114 01:05:01.050585 1668 prefs.cc:52] certificate-report-to-send-update not present in 
/var/lib/update_engine/prefs Jan 14 01:05:01.052511 update_engine[1668]: I20260114 01:05:01.050629 1668 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 14 01:05:01.052511 update_engine[1668]: I20260114 01:05:01.050812 1668 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 14 01:05:01.053427 update_engine[1668]: I20260114 01:05:01.053406 1668 omaha_request_params.cc:62] Current group set to alpha Jan 14 01:05:01.054986 update_engine[1668]: I20260114 01:05:01.054759 1668 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 14 01:05:01.054986 update_engine[1668]: I20260114 01:05:01.054775 1668 update_attempter.cc:643] Scheduling an action processor start. Jan 14 01:05:01.054986 update_engine[1668]: I20260114 01:05:01.054796 1668 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:05:01.055101 locksmithd[1732]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 14 01:05:01.063154 update_engine[1668]: I20260114 01:05:01.062192 1668 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 14 01:05:01.063154 update_engine[1668]: I20260114 01:05:01.062280 1668 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:05:01.063154 update_engine[1668]: I20260114 01:05:01.062289 1668 omaha_request_action.cc:272] Request: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: Jan 14 01:05:01.063154 update_engine[1668]: I20260114 01:05:01.062295 1668 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:05:01.066547 update_engine[1668]: I20260114 01:05:01.066525 1668 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:05:01.067102 update_engine[1668]: I20260114 01:05:01.067082 1668 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
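The Omaha request here is being posted to the literal host `disabled`, so the `Could not resolve host: disabled` failures that follow are expected behaviour rather than a network fault: on Flatcar the update endpoint normally comes from the `SERVER=` key in `/etc/flatcar/update.conf` (the same file the `Current group set to alpha` line is derived from), and `SERVER=disabled` is the conventional way to switch automatic updates off. A minimal sketch, assuming that file layout (paths and keys are assumptions, not taken from this log), which reports how the node is configured:

```python
# Sketch: report whether Flatcar's update_engine points at a real Omaha
# endpoint. Assumes the conventional key=value layout of
# /usr/share/flatcar/update.conf overridden by /etc/flatcar/update.conf.
from pathlib import Path

def read_update_conf(paths=("/usr/share/flatcar/update.conf",
                            "/etc/flatcar/update.conf")):
    conf = {}
    for p in paths:                      # later files override earlier ones
        f = Path(p)
        if not f.is_file():
            continue
        for line in f.read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                conf[key.strip()] = value.strip()
    return conf

if __name__ == "__main__":
    conf = read_update_conf()
    server = conf.get("SERVER", "")
    group = conf.get("GROUP", "unknown")
    if server.lower() in ("", "disabled"):
        print(f"group={group}: automatic updates disabled (SERVER={server!r})")
    else:
        print(f"group={group}: Omaha endpoint {server}")
```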
Jan 14 01:05:01.075072 update_engine[1668]: E20260114 01:05:01.075037 1668 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:05:01.075203 update_engine[1668]: I20260114 01:05:01.075177 1668 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 14 01:05:03.858933 kubelet[3316]: E0114 01:05:03.858856 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:05:03.859490 kubelet[3316]: E0114 01:05:03.858865 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:05:06.858359 kubelet[3316]: E0114 01:05:06.858302 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:05:07.856884 kubelet[3316]: E0114 01:05:07.856310 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:05:08.857435 kubelet[3316]: E0114 01:05:08.857155 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:05:09.857339 kubelet[3316]: E0114 01:05:09.857230 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:05:11.041495 update_engine[1668]: I20260114 01:05:11.041036 1668 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:05:11.041495 update_engine[1668]: I20260114 01:05:11.041134 1668 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:05:11.041495 update_engine[1668]: I20260114 01:05:11.041427 1668 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:05:11.048174 update_engine[1668]: E20260114 01:05:11.048123 1668 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:05:11.048288 update_engine[1668]: I20260114 01:05:11.048213 1668 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 14 01:05:15.858225 containerd[1695]: time="2026-01-14T01:05:15.858020122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:05:16.236916 containerd[1695]: time="2026-01-14T01:05:16.236804797Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:16.239013 containerd[1695]: time="2026-01-14T01:05:16.238966732Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:05:16.239111 containerd[1695]: time="2026-01-14T01:05:16.239039751Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:16.239270 kubelet[3316]: E0114 01:05:16.239244 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:05:16.239797 kubelet[3316]: E0114 01:05:16.239566 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:05:16.239797 kubelet[3316]: E0114 01:05:16.239686 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code 
= NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:16.241205 containerd[1695]: time="2026-01-14T01:05:16.241119317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:05:16.573086 containerd[1695]: time="2026-01-14T01:05:16.572550076Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:16.574602 containerd[1695]: time="2026-01-14T01:05:16.574522723Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:05:16.574602 containerd[1695]: time="2026-01-14T01:05:16.574573012Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:16.574726 kubelet[3316]: E0114 01:05:16.574699 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:05:16.574802 kubelet[3316]: E0114 01:05:16.574737 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:05:16.574825 kubelet[3316]: E0114 01:05:16.574800 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:16.574825 kubelet[3316]: E0114 01:05:16.574833 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:05:16.857069 containerd[1695]: time="2026-01-14T01:05:16.856877220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:05:17.186199 containerd[1695]: time="2026-01-14T01:05:17.186085646Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:17.192311 containerd[1695]: time="2026-01-14T01:05:17.192236635Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:05:17.192449 containerd[1695]: time="2026-01-14T01:05:17.192330404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:17.193011 kubelet[3316]: E0114 01:05:17.192587 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:05:17.193011 kubelet[3316]: E0114 01:05:17.192633 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:05:17.193011 kubelet[3316]: E0114 01:05:17.192697 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:17.193011 kubelet[3316]: E0114 01:05:17.192726 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:05:17.858502 containerd[1695]: time="2026-01-14T01:05:17.857285362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:05:18.183973 containerd[1695]: time="2026-01-14T01:05:18.183879206Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:18.185640 containerd[1695]: time="2026-01-14T01:05:18.185549543Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:05:18.185640 containerd[1695]: time="2026-01-14T01:05:18.185606615Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:18.185899 kubelet[3316]: E0114 01:05:18.185854 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:05:18.185899 kubelet[3316]: E0114 01:05:18.185896 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:05:18.186181 kubelet[3316]: E0114 01:05:18.186000 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:18.187602 containerd[1695]: time="2026-01-14T01:05:18.187447385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:05:18.541995 containerd[1695]: time="2026-01-14T01:05:18.541715514Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:18.545154 containerd[1695]: time="2026-01-14T01:05:18.545036752Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:05:18.545154 containerd[1695]: time="2026-01-14T01:05:18.545119957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:18.545432 kubelet[3316]: E0114 01:05:18.545389 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:05:18.545476 kubelet[3316]: E0114 01:05:18.545433 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:05:18.546278 kubelet[3316]: E0114 01:05:18.545502 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:18.546278 kubelet[3316]: E0114 01:05:18.545539 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:05:19.856697 
containerd[1695]: time="2026-01-14T01:05:19.856488524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:05:20.192208 containerd[1695]: time="2026-01-14T01:05:20.192103168Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:20.194020 containerd[1695]: time="2026-01-14T01:05:20.193795633Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:05:20.194020 containerd[1695]: time="2026-01-14T01:05:20.193870508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:20.194170 kubelet[3316]: E0114 01:05:20.194129 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:05:20.194518 kubelet[3316]: E0114 01:05:20.194181 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:05:20.194518 kubelet[3316]: E0114 01:05:20.194262 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:20.194518 kubelet[3316]: E0114 01:05:20.194291 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:05:21.037315 update_engine[1668]: I20260114 01:05:21.037238 1668 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:05:21.037315 update_engine[1668]: I20260114 01:05:21.037317 1668 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:05:21.037685 update_engine[1668]: I20260114 01:05:21.037616 1668 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:05:21.043991 update_engine[1668]: E20260114 01:05:21.043953 1668 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:05:21.044117 update_engine[1668]: I20260114 01:05:21.044027 1668 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 14 01:05:21.858844 containerd[1695]: time="2026-01-14T01:05:21.858796850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:05:22.188807 containerd[1695]: time="2026-01-14T01:05:22.188209930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:22.190106 containerd[1695]: time="2026-01-14T01:05:22.190070411Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:05:22.190924 containerd[1695]: time="2026-01-14T01:05:22.190156989Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:22.190970 kubelet[3316]: E0114 01:05:22.190933 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:05:22.191230 kubelet[3316]: E0114 01:05:22.190975 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:05:22.191230 kubelet[3316]: E0114 01:05:22.191083 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:22.191230 kubelet[3316]: E0114 01:05:22.191110 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:05:22.856304 containerd[1695]: time="2026-01-14T01:05:22.856250691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:05:23.178314 containerd[1695]: time="2026-01-14T01:05:23.178216432Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:05:23.180656 containerd[1695]: time="2026-01-14T01:05:23.180624129Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:05:23.180733 
containerd[1695]: time="2026-01-14T01:05:23.180666194Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:05:23.180899 kubelet[3316]: E0114 01:05:23.180873 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:05:23.180945 kubelet[3316]: E0114 01:05:23.180911 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:05:23.180997 kubelet[3316]: E0114 01:05:23.180982 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:05:23.181037 kubelet[3316]: E0114 01:05:23.181020 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:05:28.856706 kubelet[3316]: E0114 01:05:28.856450 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:05:28.857631 kubelet[3316]: E0114 01:05:28.856938 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:05:30.856857 kubelet[3316]: E0114 01:05:30.856737 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:05:31.041063 update_engine[1668]: I20260114 01:05:31.040257 1668 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:05:31.041063 update_engine[1668]: I20260114 01:05:31.040337 1668 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:05:31.041063 update_engine[1668]: I20260114 01:05:31.040619 1668 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:05:31.047012 update_engine[1668]: E20260114 01:05:31.046970 1668 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:05:31.047125 update_engine[1668]: I20260114 01:05:31.047044 1668 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:05:31.047125 update_engine[1668]: I20260114 01:05:31.047065 1668 omaha_request_action.cc:617] Omaha request response: Jan 14 01:05:31.047163 update_engine[1668]: E20260114 01:05:31.047143 1668 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 14 01:05:31.047181 update_engine[1668]: I20260114 01:05:31.047171 1668 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 14 01:05:31.047181 update_engine[1668]: I20260114 01:05:31.047176 1668 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:05:31.047214 update_engine[1668]: I20260114 01:05:31.047180 1668 update_attempter.cc:306] Processing Done. Jan 14 01:05:31.047214 update_engine[1668]: E20260114 01:05:31.047191 1668 update_attempter.cc:619] Update failed. Jan 14 01:05:31.047214 update_engine[1668]: I20260114 01:05:31.047196 1668 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 14 01:05:31.047214 update_engine[1668]: I20260114 01:05:31.047199 1668 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 14 01:05:31.047214 update_engine[1668]: I20260114 01:05:31.047204 1668 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jan 14 01:05:31.047299 update_engine[1668]: I20260114 01:05:31.047259 1668 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:05:31.047299 update_engine[1668]: I20260114 01:05:31.047278 1668 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:05:31.047299 update_engine[1668]: I20260114 01:05:31.047282 1668 omaha_request_action.cc:272] Request: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: Jan 14 01:05:31.047299 update_engine[1668]: I20260114 01:05:31.047286 1668 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:05:31.047450 update_engine[1668]: I20260114 01:05:31.047303 1668 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:05:31.047607 update_engine[1668]: I20260114 01:05:31.047520 1668 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:05:31.049185 locksmithd[1732]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 14 01:05:31.055202 update_engine[1668]: E20260114 01:05:31.053905 1668 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.053987 1668 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.053996 1668 omaha_request_action.cc:617] Omaha request response: Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.054003 1668 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.054008 1668 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.054013 1668 update_attempter.cc:306] Processing Done. Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.054018 1668 update_attempter.cc:310] Error event sent. 
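locksmithd mirrors the update engine's state in single-line `key=value` records, like the `UPDATE_STATUS_REPORTING_ERROR_EVENT` line just above and the `UPDATE_STATUS_IDLE` line that follows. When pulling those records out of a long journal, a small parser keeps the fields usable; this is only a convenience sketch over the exact format the log shows:

```python
# Sketch: turn a locksmithd status line from this journal into a dict.
# The field layout is exactly what the log shows; nothing else is assumed.
import shlex

def parse_locksmithd_status(line: str) -> dict[str, str]:
    # shlex honours the quotes around CurrentOperation="..."
    return dict(field.split("=", 1) for field in shlex.split(line))

if __name__ == "__main__":
    sample = ('LastCheckedTime=0 Progress=0 '
              'CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0')
    print(parse_locksmithd_status(sample))
    # {'LastCheckedTime': '0', 'Progress': '0',
    #  'CurrentOperation': 'UPDATE_STATUS_IDLE',
    #  'NewVersion': '0.0.0', 'NewSize': '0'}
```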
Jan 14 01:05:31.055202 update_engine[1668]: I20260114 01:05:31.054026 1668 update_check_scheduler.cc:74] Next update check in 47m11s Jan 14 01:05:31.055409 locksmithd[1732]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 14 01:05:33.857589 kubelet[3316]: E0114 01:05:33.857505 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:05:33.857942 kubelet[3316]: E0114 01:05:33.857841 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:05:35.858793 kubelet[3316]: E0114 01:05:35.858560 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:05:41.858113 kubelet[3316]: E0114 01:05:41.857432 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:05:42.855785 kubelet[3316]: E0114 01:05:42.855736 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" 
podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:05:45.858312 kubelet[3316]: E0114 01:05:45.857806 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:05:45.859744 kubelet[3316]: E0114 01:05:45.858524 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:05:47.857132 kubelet[3316]: E0114 01:05:47.856550 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:05:48.855654 kubelet[3316]: E0114 01:05:48.855614 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:05:53.859310 kubelet[3316]: E0114 01:05:53.859269 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:05:55.857590 kubelet[3316]: E0114 01:05:55.857215 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:05:56.858382 kubelet[3316]: E0114 01:05:56.858325 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:05:58.855363 kubelet[3316]: E0114 01:05:58.855296 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:06:00.856495 kubelet[3316]: E0114 01:06:00.856462 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:06:00.857583 containerd[1695]: time="2026-01-14T01:06:00.857559299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:06:01.222621 containerd[1695]: time="2026-01-14T01:06:01.222521539Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:01.224180 containerd[1695]: time="2026-01-14T01:06:01.224148393Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 
01:06:01.224252 containerd[1695]: time="2026-01-14T01:06:01.224219162Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:01.224463 kubelet[3316]: E0114 01:06:01.224363 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:06:01.224463 kubelet[3316]: E0114 01:06:01.224405 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:06:01.225209 kubelet[3316]: E0114 01:06:01.224634 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:01.225209 kubelet[3316]: E0114 01:06:01.224667 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:06:05.857109 containerd[1695]: time="2026-01-14T01:06:05.856899626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:06:06.194745 containerd[1695]: time="2026-01-14T01:06:06.194460262Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:06.196626 containerd[1695]: time="2026-01-14T01:06:06.196597396Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:06:06.196692 containerd[1695]: time="2026-01-14T01:06:06.196668272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:06.196874 kubelet[3316]: E0114 01:06:06.196842 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:06:06.197116 kubelet[3316]: E0114 01:06:06.196884 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:06:06.197116 kubelet[3316]: E0114 01:06:06.196955 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start 
failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:06.197856 containerd[1695]: time="2026-01-14T01:06:06.197688215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:06:06.505445 containerd[1695]: time="2026-01-14T01:06:06.505167559Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:06.508068 containerd[1695]: time="2026-01-14T01:06:06.507556413Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:06:06.508233 containerd[1695]: time="2026-01-14T01:06:06.507619001Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:06.508462 kubelet[3316]: E0114 01:06:06.508424 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:06:06.508515 kubelet[3316]: E0114 01:06:06.508472 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:06:06.508565 kubelet[3316]: E0114 01:06:06.508547 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:06.508609 kubelet[3316]: E0114 01:06:06.508588 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:06:07.856732 containerd[1695]: time="2026-01-14T01:06:07.856666670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:06:08.184220 containerd[1695]: time="2026-01-14T01:06:08.184118330Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:08.185855 containerd[1695]: time="2026-01-14T01:06:08.185818372Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:06:08.185939 containerd[1695]: time="2026-01-14T01:06:08.185895237Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:08.186127 kubelet[3316]: E0114 01:06:08.186095 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:06:08.186372 kubelet[3316]: E0114 01:06:08.186138 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:06:08.186372 kubelet[3316]: E0114 01:06:08.186202 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:08.189334 containerd[1695]: time="2026-01-14T01:06:08.189310444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:06:08.549113 containerd[1695]: time="2026-01-14T01:06:08.548921195Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:08.550689 containerd[1695]: time="2026-01-14T01:06:08.550661299Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:06:08.550758 containerd[1695]: time="2026-01-14T01:06:08.550732539Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:08.550980 kubelet[3316]: E0114 01:06:08.550939 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:06:08.551033 kubelet[3316]: E0114 01:06:08.550984 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:06:08.551079 kubelet[3316]: E0114 01:06:08.551058 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:08.551108 kubelet[3316]: E0114 01:06:08.551093 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:06:08.856686 containerd[1695]: time="2026-01-14T01:06:08.856422523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:06:09.188170 containerd[1695]: time="2026-01-14T01:06:09.188067370Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:09.189738 containerd[1695]: time="2026-01-14T01:06:09.189708233Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:06:09.189805 containerd[1695]: time="2026-01-14T01:06:09.189785418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:09.190061 kubelet[3316]: E0114 01:06:09.189935 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:06:09.190061 kubelet[3316]: E0114 01:06:09.189986 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:06:09.190354 kubelet[3316]: E0114 01:06:09.190290 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:09.190354 kubelet[3316]: E0114 01:06:09.190325 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:06:13.857526 containerd[1695]: time="2026-01-14T01:06:13.857328353Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:06:14.186934 containerd[1695]: time="2026-01-14T01:06:14.186565001Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:14.190178 containerd[1695]: time="2026-01-14T01:06:14.190096976Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:06:14.190264 containerd[1695]: time="2026-01-14T01:06:14.190163478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:14.190332 kubelet[3316]: E0114 01:06:14.190304 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:06:14.190621 kubelet[3316]: E0114 01:06:14.190342 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:06:14.190621 kubelet[3316]: E0114 01:06:14.190404 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:14.190621 kubelet[3316]: E0114 01:06:14.190434 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:06:14.857084 containerd[1695]: time="2026-01-14T01:06:14.857031880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:06:15.182743 containerd[1695]: time="2026-01-14T01:06:15.182467913Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:06:15.184573 containerd[1695]: time="2026-01-14T01:06:15.184539685Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:06:15.184678 containerd[1695]: time="2026-01-14T01:06:15.184609359Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:06:15.184940 kubelet[3316]: E0114 01:06:15.184736 3316 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:06:15.184940 kubelet[3316]: E0114 01:06:15.184782 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:06:15.184940 kubelet[3316]: E0114 01:06:15.184853 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:06:15.184940 kubelet[3316]: E0114 01:06:15.184883 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:06:15.857265 kubelet[3316]: E0114 01:06:15.857232 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:06:19.858446 kubelet[3316]: E0114 01:06:19.858413 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:06:19.858998 kubelet[3316]: E0114 01:06:19.858954 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:06:23.857190 kubelet[3316]: E0114 01:06:23.856903 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:06:26.857067 kubelet[3316]: E0114 01:06:26.857010 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:06:28.856721 kubelet[3316]: E0114 01:06:28.856477 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:06:29.858353 kubelet[3316]: E0114 01:06:29.858208 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:06:31.857737 kubelet[3316]: E0114 01:06:31.857032 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:06:33.859131 kubelet[3316]: E0114 01:06:33.858892 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:06:35.856601 kubelet[3316]: E0114 01:06:35.856285 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:06:39.857148 kubelet[3316]: E0114 01:06:39.857016 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:06:41.858114 kubelet[3316]: E0114 01:06:41.857645 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:06:42.855376 kubelet[3316]: E0114 01:06:42.855270 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" 
podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:06:44.855851 kubelet[3316]: E0114 01:06:44.855804 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:06:45.858423 kubelet[3316]: E0114 01:06:45.858380 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:06:47.855601 kubelet[3316]: E0114 01:06:47.855565 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:06:50.856227 kubelet[3316]: E0114 01:06:50.856153 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:06:53.856380 kubelet[3316]: E0114 01:06:53.856324 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:06:55.856319 kubelet[3316]: E0114 01:06:55.855731 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:06:56.855794 kubelet[3316]: E0114 01:06:56.855751 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:06:58.856270 kubelet[3316]: E0114 01:06:58.856230 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:07:00.857806 kubelet[3316]: E0114 01:07:00.857736 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:07:03.856866 kubelet[3316]: E0114 01:07:03.856823 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:07:07.857965 kubelet[3316]: E0114 01:07:07.857925 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:07:07.861076 kubelet[3316]: E0114 01:07:07.860439 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:07:09.855746 kubelet[3316]: E0114 01:07:09.855456 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:07:10.855994 kubelet[3316]: E0114 01:07:10.855949 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:07:13.858637 kubelet[3316]: E0114 01:07:13.858397 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:07:14.855822 kubelet[3316]: E0114 01:07:14.855739 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:07:19.858717 kubelet[3316]: E0114 01:07:19.858254 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:07:20.855610 kubelet[3316]: E0114 01:07:20.855514 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:07:22.857065 kubelet[3316]: E0114 01:07:22.857014 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:07:23.855699 kubelet[3316]: E0114 01:07:23.855655 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:07:27.856656 containerd[1695]: time="2026-01-14T01:07:27.856606440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:07:28.197636 containerd[1695]: time="2026-01-14T01:07:28.197268587Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:28.199119 containerd[1695]: time="2026-01-14T01:07:28.199014788Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:07:28.199119 containerd[1695]: time="2026-01-14T01:07:28.199089389Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:28.199270 kubelet[3316]: E0114 01:07:28.199241 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:07:28.199498 kubelet[3316]: E0114 01:07:28.199282 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:07:28.199498 kubelet[3316]: E0114 01:07:28.199423 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:28.201335 containerd[1695]: time="2026-01-14T01:07:28.201314318Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:07:28.524898 containerd[1695]: time="2026-01-14T01:07:28.524545648Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:28.527019 containerd[1695]: time="2026-01-14T01:07:28.526899272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:07:28.527239 containerd[1695]: time="2026-01-14T01:07:28.527161931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:28.527409 kubelet[3316]: E0114 01:07:28.527374 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:07:28.527949 kubelet[3316]: E0114 01:07:28.527412 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:07:28.527949 kubelet[3316]: E0114 01:07:28.527480 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-d94f85b8d-frfrp_calico-system(51002e26-f95b-49f1-8f48-be4a381935eb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:28.527949 kubelet[3316]: E0114 01:07:28.527510 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:07:29.857615 kubelet[3316]: E0114 01:07:29.857134 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:07:33.857558 kubelet[3316]: E0114 01:07:33.857508 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:07:34.858300 containerd[1695]: time="2026-01-14T01:07:34.858272057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:07:35.215898 containerd[1695]: time="2026-01-14T01:07:35.215508825Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:35.217488 containerd[1695]: time="2026-01-14T01:07:35.217454080Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:07:35.217546 containerd[1695]: time="2026-01-14T01:07:35.217525892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:35.218206 kubelet[3316]: E0114 01:07:35.217748 3316 log.go:32] "PullImage 
from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:07:35.218206 kubelet[3316]: E0114 01:07:35.217792 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:07:35.218206 kubelet[3316]: E0114 01:07:35.218168 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-tk96l_calico-apiserver(f26be41d-3305-4b21-9d76-bde121cc2cce): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:35.218206 kubelet[3316]: E0114 01:07:35.218199 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:07:35.218512 containerd[1695]: time="2026-01-14T01:07:35.218041477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:07:35.548066 containerd[1695]: time="2026-01-14T01:07:35.547932669Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:35.550098 containerd[1695]: time="2026-01-14T01:07:35.550026483Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:07:35.550180 containerd[1695]: time="2026-01-14T01:07:35.550126450Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:35.550317 kubelet[3316]: E0114 01:07:35.550283 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:07:35.550354 kubelet[3316]: E0114 01:07:35.550322 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:07:35.550400 kubelet[3316]: E0114 01:07:35.550385 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 
14 01:07:35.551402 containerd[1695]: time="2026-01-14T01:07:35.551234105Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:07:35.886605 containerd[1695]: time="2026-01-14T01:07:35.886566247Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:35.888887 containerd[1695]: time="2026-01-14T01:07:35.888823800Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:07:35.889035 containerd[1695]: time="2026-01-14T01:07:35.888860102Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:35.889105 kubelet[3316]: E0114 01:07:35.889074 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:07:35.889150 kubelet[3316]: E0114 01:07:35.889117 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:07:35.889199 kubelet[3316]: E0114 01:07:35.889184 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-gdh9l_calico-system(4395ad87-346f-47f3-8e06-f63944f13a5d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:35.889241 kubelet[3316]: E0114 01:07:35.889221 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:07:37.858425 containerd[1695]: time="2026-01-14T01:07:37.858397866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:07:38.196680 containerd[1695]: time="2026-01-14T01:07:38.196456486Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:38.199929 containerd[1695]: time="2026-01-14T01:07:38.199879218Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:07:38.200027 containerd[1695]: time="2026-01-14T01:07:38.199958838Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:38.200139 kubelet[3316]: E0114 01:07:38.200097 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:07:38.200139 kubelet[3316]: E0114 01:07:38.200135 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:07:38.200423 kubelet[3316]: E0114 01:07:38.200195 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-86656bdf75-c9kjr_calico-system(81b57aeb-9645-4cab-a7a2-931a98fd5ce6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:38.200423 kubelet[3316]: E0114 01:07:38.200222 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:07:40.858350 kubelet[3316]: E0114 01:07:40.858161 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:07:43.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.32:22-5.187.35.21:61030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:43.277145 systemd[1]: Started sshd@7-10.0.21.32:22-5.187.35.21:61030.service - OpenSSH per-connection server daemon (5.187.35.21:61030). 
Jan 14 01:07:43.278458 kernel: kauditd_printk_skb: 52 callbacks suppressed Jan 14 01:07:43.278515 kernel: audit: type=1130 audit(1768352863.276:753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.32:22-5.187.35.21:61030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:43.858455 containerd[1695]: time="2026-01-14T01:07:43.858277954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:07:44.180976 containerd[1695]: time="2026-01-14T01:07:44.179679646Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:44.182669 containerd[1695]: time="2026-01-14T01:07:44.182624568Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:07:44.182753 containerd[1695]: time="2026-01-14T01:07:44.182709161Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:44.182905 kubelet[3316]: E0114 01:07:44.182874 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:07:44.183151 kubelet[3316]: E0114 01:07:44.182916 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:07:44.183151 kubelet[3316]: E0114 01:07:44.182976 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-vc7sq_calico-system(95ab78ae-ca97-4cab-9490-03b0a50f740c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:44.183151 kubelet[3316]: E0114 01:07:44.183010 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:07:44.856520 containerd[1695]: time="2026-01-14T01:07:44.856157599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:07:45.177030 containerd[1695]: time="2026-01-14T01:07:45.176922066Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:07:45.179204 containerd[1695]: time="2026-01-14T01:07:45.179172648Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:07:45.179257 containerd[1695]: 
time="2026-01-14T01:07:45.179243964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:07:45.179407 kubelet[3316]: E0114 01:07:45.179376 3316 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:07:45.179465 kubelet[3316]: E0114 01:07:45.179418 3316 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:07:45.179491 kubelet[3316]: E0114 01:07:45.179480 3316 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-694bbc95d6-42t9l_calico-apiserver(b7d50268-8797-458d-a912-f7456846c1f2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:07:45.179526 kubelet[3316]: E0114 01:07:45.179508 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:07:46.087435 sshd[5591]: Connection closed by authenticating user root 5.187.35.21 port 61030 [preauth] Jan 14 01:07:46.087000 audit[5591]: USER_ERR pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:46.089190 systemd[1]: sshd@7-10.0.21.32:22-5.187.35.21:61030.service: Deactivated successfully. Jan 14 01:07:46.092066 kernel: audit: type=1109 audit(1768352866.087:754): pid=5591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:46.087000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.32:22-5.187.35.21:61030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:46.097066 kernel: audit: type=1131 audit(1768352866.087:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.32:22-5.187.35.21:61030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:46.116323 systemd[1]: Started sshd@8-10.0.21.32:22-5.187.35.21:61054.service - OpenSSH per-connection server daemon (5.187.35.21:61054). 
Jan 14 01:07:46.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.32:22-5.187.35.21:61054 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:46.120079 kernel: audit: type=1130 audit(1768352866.115:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.32:22-5.187.35.21:61054 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:46.856515 kubelet[3316]: E0114 01:07:46.856470 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:07:48.856729 kubelet[3316]: E0114 01:07:48.856452 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:07:49.009109 sshd[5604]: Connection closed by authenticating user root 5.187.35.21 port 61054 [preauth] Jan 14 01:07:49.010327 systemd[1]: sshd@8-10.0.21.32:22-5.187.35.21:61054.service: Deactivated successfully. Jan 14 01:07:49.008000 audit[5604]: USER_ERR pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:49.014135 kernel: audit: type=1109 audit(1768352869.008:757): pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:49.009000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.32:22-5.187.35.21:61054 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:49.018064 kernel: audit: type=1131 audit(1768352869.009:758): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.32:22-5.187.35.21:61054 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:07:49.035000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.32:22-5.187.35.21:61084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:49.036317 systemd[1]: Started sshd@9-10.0.21.32:22-5.187.35.21:61084.service - OpenSSH per-connection server daemon (5.187.35.21:61084). Jan 14 01:07:49.040201 kernel: audit: type=1130 audit(1768352869.035:759): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.32:22-5.187.35.21:61084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:49.858212 kubelet[3316]: E0114 01:07:49.858171 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:07:51.271094 sshd[5610]: Connection closed by authenticating user root 5.187.35.21 port 61084 [preauth] Jan 14 01:07:51.270000 audit[5610]: USER_ERR pid=5610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:51.274320 systemd[1]: sshd@9-10.0.21.32:22-5.187.35.21:61084.service: Deactivated successfully. Jan 14 01:07:51.275207 kernel: audit: type=1109 audit(1768352871.270:760): pid=5610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:51.275719 kernel: audit: type=1131 audit(1768352871.274:761): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.32:22-5.187.35.21:61084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:51.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.32:22-5.187.35.21:61084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:51.299300 systemd[1]: Started sshd@10-10.0.21.32:22-5.187.35.21:61112.service - OpenSSH per-connection server daemon (5.187.35.21:61112). Jan 14 01:07:51.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.32:22-5.187.35.21:61112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:51.304159 kernel: audit: type=1130 audit(1768352871.298:762): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.32:22-5.187.35.21:61112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:07:54.701081 sshd[5616]: Connection closed by authenticating user root 5.187.35.21 port 61112 [preauth] Jan 14 01:07:54.700000 audit[5616]: USER_ERR pid=5616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:54.704544 systemd[1]: sshd@10-10.0.21.32:22-5.187.35.21:61112.service: Deactivated successfully. Jan 14 01:07:54.707193 kernel: audit: type=1109 audit(1768352874.700:763): pid=5616 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:54.707244 kernel: audit: type=1131 audit(1768352874.703:764): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.32:22-5.187.35.21:61112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:54.703000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.32:22-5.187.35.21:61112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:54.745888 systemd[1]: Started sshd@11-10.0.21.32:22-5.187.35.21:55158.service - OpenSSH per-connection server daemon (5.187.35.21:55158). Jan 14 01:07:54.745000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.32:22-5.187.35.21:55158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:54.750285 kernel: audit: type=1130 audit(1768352874.745:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.32:22-5.187.35.21:55158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:07:55.857988 kubelet[3316]: E0114 01:07:55.857922 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:07:55.859108 kubelet[3316]: E0114 01:07:55.858606 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:07:57.391794 sshd[5637]: Connection closed by authenticating user root 5.187.35.21 port 55158 [preauth] Jan 14 01:07:57.391000 audit[5637]: USER_ERR pid=5637 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:57.396179 kernel: audit: type=1109 audit(1768352877.391:766): pid=5637 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:07:57.396845 systemd[1]: sshd@11-10.0.21.32:22-5.187.35.21:55158.service: Deactivated successfully. Jan 14 01:07:57.400471 kernel: audit: type=1131 audit(1768352877.396:767): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.32:22-5.187.35.21:55158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:57.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.32:22-5.187.35.21:55158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:57.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.32:22-5.187.35.21:55168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:07:57.419674 systemd[1]: Started sshd@12-10.0.21.32:22-5.187.35.21:55168.service - OpenSSH per-connection server daemon (5.187.35.21:55168). Jan 14 01:07:57.423118 kernel: audit: type=1130 audit(1768352877.418:768): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.32:22-5.187.35.21:55168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:07:57.856615 kubelet[3316]: E0114 01:07:57.856555 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:07:58.856045 kubelet[3316]: E0114 01:07:58.856000 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:08:00.436144 sshd[5643]: Connection closed by authenticating user root 5.187.35.21 port 55168 [preauth] Jan 14 01:08:00.434000 audit[5643]: USER_ERR pid=5643 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:00.440077 kernel: audit: type=1109 audit(1768352880.434:769): pid=5643 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:00.440600 systemd[1]: sshd@12-10.0.21.32:22-5.187.35.21:55168.service: Deactivated successfully. Jan 14 01:08:00.440000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.32:22-5.187.35.21:55168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:00.445377 kernel: audit: type=1131 audit(1768352880.440:770): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.32:22-5.187.35.21:55168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:00.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.32:22-5.187.35.21:55210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:00.464171 systemd[1]: Started sshd@13-10.0.21.32:22-5.187.35.21:55210.service - OpenSSH per-connection server daemon (5.187.35.21:55210). 
Jan 14 01:08:00.468065 kernel: audit: type=1130 audit(1768352880.463:771): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.32:22-5.187.35.21:55210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:01.858953 kubelet[3316]: E0114 01:08:01.858920 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:08:02.855864 kubelet[3316]: E0114 01:08:02.855677 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:08:03.556734 sshd[5649]: Connection closed by authenticating user root 5.187.35.21 port 55210 [preauth] Jan 14 01:08:03.555000 audit[5649]: USER_ERR pid=5649 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:03.561356 kernel: audit: type=1109 audit(1768352883.555:772): pid=5649 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:03.561632 systemd[1]: sshd@13-10.0.21.32:22-5.187.35.21:55210.service: Deactivated successfully. Jan 14 01:08:03.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.32:22-5.187.35.21:55210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:03.565198 kernel: audit: type=1131 audit(1768352883.560:773): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.32:22-5.187.35.21:55210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:03.583532 systemd[1]: Started sshd@14-10.0.21.32:22-5.187.35.21:63898.service - OpenSSH per-connection server daemon (5.187.35.21:63898). Jan 14 01:08:03.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.32:22-5.187.35.21:63898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:03.589073 kernel: audit: type=1130 audit(1768352883.584:774): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.32:22-5.187.35.21:63898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:06.441472 sshd[5657]: Connection closed by authenticating user root 5.187.35.21 port 63898 [preauth] Jan 14 01:08:06.440000 audit[5657]: USER_ERR pid=5657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:06.443804 systemd[1]: sshd@14-10.0.21.32:22-5.187.35.21:63898.service: Deactivated successfully. Jan 14 01:08:06.447071 kernel: audit: type=1109 audit(1768352886.440:775): pid=5657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:06.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.32:22-5.187.35.21:63898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:06.452138 kernel: audit: type=1131 audit(1768352886.442:776): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.32:22-5.187.35.21:63898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:06.465109 systemd[1]: Started sshd@15-10.0.21.32:22-5.187.35.21:63918.service - OpenSSH per-connection server daemon (5.187.35.21:63918). Jan 14 01:08:06.464000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.32:22-5.187.35.21:63918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:06.469183 kernel: audit: type=1130 audit(1768352886.464:777): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.32:22-5.187.35.21:63918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:06.856256 kubelet[3316]: E0114 01:08:06.855562 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:08:06.856871 kubelet[3316]: E0114 01:08:06.856828 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:08:09.536067 sshd[5663]: Connection closed by authenticating user root 5.187.35.21 port 63918 [preauth] Jan 14 01:08:09.534000 audit[5663]: USER_ERR pid=5663 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:09.540064 kernel: audit: type=1109 audit(1768352889.534:778): pid=5663 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:09.539907 systemd[1]: sshd@15-10.0.21.32:22-5.187.35.21:63918.service: Deactivated successfully. Jan 14 01:08:09.539000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.32:22-5.187.35.21:63918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:09.547063 kernel: audit: type=1131 audit(1768352889.539:779): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.32:22-5.187.35.21:63918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:09.573311 systemd[1]: Started sshd@16-10.0.21.32:22-5.187.35.21:63942.service - OpenSSH per-connection server daemon (5.187.35.21:63942). Jan 14 01:08:09.572000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.32:22-5.187.35.21:63942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:09.577069 kernel: audit: type=1130 audit(1768352889.572:780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.32:22-5.187.35.21:63942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:09.859140 kubelet[3316]: E0114 01:08:09.859094 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:08:12.562619 sshd[5669]: Connection closed by authenticating user root 5.187.35.21 port 63942 [preauth] Jan 14 01:08:12.564000 audit[5669]: USER_ERR pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:12.568074 kernel: audit: type=1109 audit(1768352892.564:781): pid=5669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:12.568069 systemd[1]: sshd@16-10.0.21.32:22-5.187.35.21:63942.service: Deactivated successfully. Jan 14 01:08:12.568000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.32:22-5.187.35.21:63942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:12.574364 kernel: audit: type=1131 audit(1768352892.568:782): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.32:22-5.187.35.21:63942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:12.593121 systemd[1]: Started sshd@17-10.0.21.32:22-5.187.35.21:43242.service - OpenSSH per-connection server daemon (5.187.35.21:43242). Jan 14 01:08:12.593000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.32:22-5.187.35.21:43242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:12.597061 kernel: audit: type=1130 audit(1768352892.593:783): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.32:22-5.187.35.21:43242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:12.857421 kubelet[3316]: E0114 01:08:12.856515 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:08:13.855687 kubelet[3316]: E0114 01:08:13.855580 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:08:14.856274 kubelet[3316]: E0114 01:08:14.856241 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:08:15.290000 audit[5702]: USER_ERR pid=5702 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:15.290340 sshd[5702]: Connection closed by authenticating user root 5.187.35.21 port 43242 [preauth] Jan 14 01:08:15.294066 kernel: audit: type=1109 audit(1768352895.290:784): pid=5702 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:15.294198 systemd[1]: sshd@17-10.0.21.32:22-5.187.35.21:43242.service: Deactivated successfully. Jan 14 01:08:15.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.32:22-5.187.35.21:43242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:15.300066 kernel: audit: type=1131 audit(1768352895.294:785): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.32:22-5.187.35.21:43242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:15.321018 systemd[1]: Started sshd@18-10.0.21.32:22-5.187.35.21:43256.service - OpenSSH per-connection server daemon (5.187.35.21:43256). Jan 14 01:08:15.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.32:22-5.187.35.21:43256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:15.326363 kernel: audit: type=1130 audit(1768352895.321:786): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.32:22-5.187.35.21:43256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:18.720511 sshd[5708]: Connection closed by authenticating user root 5.187.35.21 port 43256 [preauth] Jan 14 01:08:18.720000 audit[5708]: USER_ERR pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:18.725076 kernel: audit: type=1109 audit(1768352898.720:787): pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:18.725231 systemd[1]: sshd@18-10.0.21.32:22-5.187.35.21:43256.service: Deactivated successfully. Jan 14 01:08:18.725000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.32:22-5.187.35.21:43256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:18.731097 kernel: audit: type=1131 audit(1768352898.725:788): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.32:22-5.187.35.21:43256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:18.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.32:22-5.187.35.21:43270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:18.747275 systemd[1]: Started sshd@19-10.0.21.32:22-5.187.35.21:43270.service - OpenSSH per-connection server daemon (5.187.35.21:43270). Jan 14 01:08:18.752180 kernel: audit: type=1130 audit(1768352898.747:789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.32:22-5.187.35.21:43270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:18.856026 kubelet[3316]: E0114 01:08:18.855994 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:08:21.351432 sshd[5714]: Connection closed by authenticating user root 5.187.35.21 port 43270 [preauth] Jan 14 01:08:21.351000 audit[5714]: USER_ERR pid=5714 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:21.352988 systemd[1]: sshd@19-10.0.21.32:22-5.187.35.21:43270.service: Deactivated successfully. 
Jan 14 01:08:21.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.32:22-5.187.35.21:43270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:21.357389 kernel: audit: type=1109 audit(1768352901.351:790): pid=5714 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:21.357441 kernel: audit: type=1131 audit(1768352901.353:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.32:22-5.187.35.21:43270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:21.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.32:22-5.187.35.21:43280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:21.379303 systemd[1]: Started sshd@20-10.0.21.32:22-5.187.35.21:43280.service - OpenSSH per-connection server daemon (5.187.35.21:43280). Jan 14 01:08:21.384069 kernel: audit: type=1130 audit(1768352901.379:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.32:22-5.187.35.21:43280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:21.856819 kubelet[3316]: E0114 01:08:21.856766 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:08:21.857161 kubelet[3316]: E0114 01:08:21.856936 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:08:24.137373 sshd[5720]: Connection closed by authenticating user root 5.187.35.21 port 43280 
[preauth] Jan 14 01:08:24.144303 kernel: audit: type=1109 audit(1768352904.137:793): pid=5720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:24.137000 audit[5720]: USER_ERR pid=5720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:24.141732 systemd[1]: sshd@20-10.0.21.32:22-5.187.35.21:43280.service: Deactivated successfully. Jan 14 01:08:24.141000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.32:22-5.187.35.21:43280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:24.148068 kernel: audit: type=1131 audit(1768352904.141:794): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.32:22-5.187.35.21:43280 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:24.180000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.32:22-5.187.35.21:30930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:24.180292 systemd[1]: Started sshd@21-10.0.21.32:22-5.187.35.21:30930.service - OpenSSH per-connection server daemon (5.187.35.21:30930). Jan 14 01:08:24.184074 kernel: audit: type=1130 audit(1768352904.180:795): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.32:22-5.187.35.21:30930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:24.856282 kubelet[3316]: E0114 01:08:24.856159 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:08:24.857241 kubelet[3316]: E0114 01:08:24.857209 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:08:27.551577 sshd[5727]: Connection closed by authenticating user root 5.187.35.21 port 30930 [preauth] Jan 14 01:08:27.552000 audit[5727]: USER_ERR pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:27.556094 kernel: audit: type=1109 audit(1768352907.552:796): pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:27.557453 systemd[1]: sshd@21-10.0.21.32:22-5.187.35.21:30930.service: Deactivated successfully. Jan 14 01:08:27.557000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.32:22-5.187.35.21:30930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:27.562097 kernel: audit: type=1131 audit(1768352907.557:797): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.32:22-5.187.35.21:30930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:27.581351 kernel: audit: type=1130 audit(1768352907.577:798): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.32:22-5.187.35.21:30956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:27.577000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.32:22-5.187.35.21:30956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:27.577180 systemd[1]: Started sshd@22-10.0.21.32:22-5.187.35.21:30956.service - OpenSSH per-connection server daemon (5.187.35.21:30956). Jan 14 01:08:27.856606 kubelet[3316]: E0114 01:08:27.856537 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:08:30.465142 sshd[5736]: Connection closed by authenticating user root 5.187.35.21 port 30956 [preauth] Jan 14 01:08:30.465000 audit[5736]: USER_ERR pid=5736 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:30.466843 systemd[1]: sshd@22-10.0.21.32:22-5.187.35.21:30956.service: Deactivated successfully. Jan 14 01:08:30.470070 kernel: audit: type=1109 audit(1768352910.465:799): pid=5736 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:30.474326 kernel: audit: type=1131 audit(1768352910.467:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.32:22-5.187.35.21:30956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:30.467000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.32:22-5.187.35.21:30956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:30.497603 kernel: audit: type=1130 audit(1768352910.492:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.32:22-5.187.35.21:30972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:30.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.32:22-5.187.35.21:30972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:30.492141 systemd[1]: Started sshd@23-10.0.21.32:22-5.187.35.21:30972.service - OpenSSH per-connection server daemon (5.187.35.21:30972). Jan 14 01:08:33.039792 sshd[5748]: Connection closed by authenticating user root 5.187.35.21 port 30972 [preauth] Jan 14 01:08:33.039000 audit[5748]: USER_ERR pid=5748 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:33.044079 kernel: audit: type=1109 audit(1768352913.039:802): pid=5748 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:33.045025 systemd[1]: sshd@23-10.0.21.32:22-5.187.35.21:30972.service: Deactivated successfully. Jan 14 01:08:33.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.32:22-5.187.35.21:30972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:33.049081 kernel: audit: type=1131 audit(1768352913.044:803): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.32:22-5.187.35.21:30972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:33.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.21.32:22-5.187.35.21:50186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:33.098135 systemd[1]: Started sshd@24-10.0.21.32:22-5.187.35.21:50186.service - OpenSSH per-connection server daemon (5.187.35.21:50186). Jan 14 01:08:33.102136 kernel: audit: type=1130 audit(1768352913.097:804): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.21.32:22-5.187.35.21:50186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:33.858710 kubelet[3316]: E0114 01:08:33.858214 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:08:33.859150 kubelet[3316]: E0114 01:08:33.858776 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:08:33.859150 kubelet[3316]: E0114 01:08:33.858846 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:08:36.527211 sshd[5756]: Connection closed by authenticating user root 5.187.35.21 port 50186 [preauth] Jan 14 01:08:36.527000 audit[5756]: USER_ERR pid=5756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:36.529896 systemd[1]: sshd@24-10.0.21.32:22-5.187.35.21:50186.service: Deactivated successfully. Jan 14 01:08:36.532425 kernel: audit: type=1109 audit(1768352916.527:805): pid=5756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:36.529000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.21.32:22-5.187.35.21:50186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:36.536069 kernel: audit: type=1131 audit(1768352916.529:806): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.21.32:22-5.187.35.21:50186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:36.556687 systemd[1]: Started sshd@25-10.0.21.32:22-5.187.35.21:50192.service - OpenSSH per-connection server daemon (5.187.35.21:50192). Jan 14 01:08:36.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.21.32:22-5.187.35.21:50192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:36.561113 kernel: audit: type=1130 audit(1768352916.555:807): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.21.32:22-5.187.35.21:50192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:38.891365 sshd[5762]: Invalid user Antminer from 5.187.35.21 port 50192 Jan 14 01:08:39.305574 sshd[5762]: Connection closed by invalid user Antminer 5.187.35.21 port 50192 [preauth] Jan 14 01:08:39.305000 audit[5762]: USER_ERR pid=5762 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:39.308000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.21.32:22-5.187.35.21:50192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:39.309463 systemd[1]: sshd@25-10.0.21.32:22-5.187.35.21:50192.service: Deactivated successfully. Jan 14 01:08:39.311152 kernel: audit: type=1109 audit(1768352919.305:808): pid=5762 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:39.311233 kernel: audit: type=1131 audit(1768352919.308:809): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.21.32:22-5.187.35.21:50192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:39.330630 systemd[1]: Started sshd@26-10.0.21.32:22-5.187.35.21:50212.service - OpenSSH per-connection server daemon (5.187.35.21:50212). Jan 14 01:08:39.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.21.32:22-5.187.35.21:50212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:39.335068 kernel: audit: type=1130 audit(1768352919.329:810): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.21.32:22-5.187.35.21:50212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:39.857019 kubelet[3316]: E0114 01:08:39.856233 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:08:39.857019 kubelet[3316]: E0114 01:08:39.856303 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:08:41.709430 sshd[5768]: Invalid user Antminer from 5.187.35.21 port 50212 Jan 14 01:08:41.857535 kubelet[3316]: E0114 01:08:41.857499 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:08:42.612371 sshd[5768]: Connection closed by invalid user Antminer 5.187.35.21 port 50212 [preauth] Jan 14 01:08:42.612000 audit[5768]: USER_ERR pid=5768 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:42.614899 systemd[1]: sshd@26-10.0.21.32:22-5.187.35.21:50212.service: Deactivated successfully. Jan 14 01:08:42.617497 kernel: audit: type=1109 audit(1768352922.612:811): pid=5768 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:42.614000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.21.32:22-5.187.35.21:50212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:42.621099 kernel: audit: type=1131 audit(1768352922.614:812): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.21.32:22-5.187.35.21:50212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:42.637502 systemd[1]: Started sshd@27-10.0.21.32:22-5.187.35.21:50318.service - OpenSSH per-connection server daemon (5.187.35.21:50318). 
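The kubelet entries above all fail the same way: containerd reports NotFound for the v3.30.4 tags under ghcr.io/flatcar/calico (goldmane, csi, node-driver-registrar, whisker, whisker-backend, kube-controllers, apiserver), so every pod sync ends in ImagePullBackOff. As a minimal sketch, assuming the journal text shown here is saved to a plain file (the name node.log below is hypothetical, not part of the log), the distinct failing image references can be tallied with the standard library:

import re
from collections import Counter

LOG_PATH = "node.log"  # hypothetical path; any file holding this journal text works
IMAGE_RE = re.compile(r'ghcr\.io/[^\s"\\]+')

backoffs = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        if "ImagePullBackOff" not in line:
            continue
        # Each error repeats the reference several times (and a physical line in this
        # dump may hold several journal entries), so collapse to distinct images per line.
        for image in {m.rstrip(":") for m in IMAGE_RE.findall(line)}:
            backoffs[image] += 1

for image, count in backoffs.most_common():
    print(f"{count:4d}  {image}")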
Jan 14 01:08:42.636000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.21.32:22-5.187.35.21:50318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:42.642109 kernel: audit: type=1130 audit(1768352922.636:813): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.21.32:22-5.187.35.21:50318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:45.355546 sshd[5798]: Connection closed by authenticating user root 5.187.35.21 port 50318 [preauth] Jan 14 01:08:45.354000 audit[5798]: USER_ERR pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:45.359134 kernel: audit: type=1109 audit(1768352925.354:814): pid=5798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:45.364335 kernel: audit: type=1131 audit(1768352925.358:815): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.21.32:22-5.187.35.21:50318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:45.358000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.21.32:22-5.187.35.21:50318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:45.359287 systemd[1]: sshd@27-10.0.21.32:22-5.187.35.21:50318.service: Deactivated successfully. Jan 14 01:08:45.380785 systemd[1]: Started sshd@28-10.0.21.32:22-5.187.35.21:50372.service - OpenSSH per-connection server daemon (5.187.35.21:50372). Jan 14 01:08:45.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.21.32:22-5.187.35.21:50372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:45.385064 kernel: audit: type=1130 audit(1768352925.379:816): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.21.32:22-5.187.35.21:50372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:47.860723 kubelet[3316]: E0114 01:08:47.859719 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:08:47.861756 kubelet[3316]: E0114 01:08:47.861653 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:08:48.252925 sshd[5805]: Connection closed by authenticating user root 5.187.35.21 port 50372 [preauth] Jan 14 01:08:48.258138 kernel: audit: type=1109 audit(1768352928.252:817): pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:48.252000 audit[5805]: USER_ERR pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:48.258857 systemd[1]: sshd@28-10.0.21.32:22-5.187.35.21:50372.service: Deactivated successfully. Jan 14 01:08:48.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.21.32:22-5.187.35.21:50372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:48.268643 kernel: audit: type=1131 audit(1768352928.258:818): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.21.32:22-5.187.35.21:50372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:48.285242 systemd[1]: Started sshd@29-10.0.21.32:22-5.187.35.21:50408.service - OpenSSH per-connection server daemon (5.187.35.21:50408). Jan 14 01:08:48.284000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.21.32:22-5.187.35.21:50408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:48.290076 kernel: audit: type=1130 audit(1768352928.284:819): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.21.32:22-5.187.35.21:50408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:08:48.856707 kubelet[3316]: E0114 01:08:48.856483 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:08:49.870956 containerd[1695]: time="2026-01-14T01:08:49.870884135Z" level=info msg="container event discarded" container=3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933 type=CONTAINER_CREATED_EVENT Jan 14 01:08:49.882421 containerd[1695]: time="2026-01-14T01:08:49.882365201Z" level=info msg="container event discarded" container=3c6a4a166e565db90dff8f49bc39c51104f6ab4c30ab86706e6d0af786397933 type=CONTAINER_STARTED_EVENT Jan 14 01:08:49.904692 containerd[1695]: time="2026-01-14T01:08:49.904638226Z" level=info msg="container event discarded" container=486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740 type=CONTAINER_CREATED_EVENT Jan 14 01:08:49.904692 containerd[1695]: time="2026-01-14T01:08:49.904681843Z" level=info msg="container event discarded" container=486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740 type=CONTAINER_STARTED_EVENT Jan 14 01:08:49.904692 containerd[1695]: time="2026-01-14T01:08:49.904689455Z" level=info msg="container event discarded" container=697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee type=CONTAINER_CREATED_EVENT Jan 14 01:08:49.904692 containerd[1695]: time="2026-01-14T01:08:49.904695684Z" level=info msg="container event discarded" container=697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee type=CONTAINER_STARTED_EVENT Jan 14 01:08:49.923910 containerd[1695]: time="2026-01-14T01:08:49.923861073Z" level=info msg="container event discarded" container=7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9 type=CONTAINER_CREATED_EVENT Jan 14 01:08:49.937135 containerd[1695]: time="2026-01-14T01:08:49.937088801Z" level=info msg="container event discarded" container=2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f type=CONTAINER_CREATED_EVENT Jan 14 01:08:49.937135 containerd[1695]: time="2026-01-14T01:08:49.937126360Z" level=info msg="container event discarded" container=45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8 type=CONTAINER_CREATED_EVENT Jan 14 01:08:50.026795 containerd[1695]: time="2026-01-14T01:08:50.026747518Z" level=info msg="container event discarded" container=7616b4921adf26bfff55412882d6e7056afa24eda596ad1ec8f967c2e11215a9 type=CONTAINER_STARTED_EVENT Jan 14 01:08:50.026795 containerd[1695]: time="2026-01-14T01:08:50.026786625Z" level=info msg="container event discarded" container=45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8 type=CONTAINER_STARTED_EVENT Jan 14 01:08:50.094651 containerd[1695]: time="2026-01-14T01:08:50.094604247Z" level=info msg="container 
event discarded" container=2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f type=CONTAINER_STARTED_EVENT Jan 14 01:08:50.856370 kubelet[3316]: E0114 01:08:50.856328 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:08:51.467889 sshd[5811]: Connection closed by authenticating user root 5.187.35.21 port 50408 [preauth] Jan 14 01:08:51.473841 kernel: audit: type=1109 audit(1768352931.466:820): pid=5811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:51.466000 audit[5811]: USER_ERR pid=5811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:51.472083 systemd[1]: sshd@29-10.0.21.32:22-5.187.35.21:50408.service: Deactivated successfully. Jan 14 01:08:51.471000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.21.32:22-5.187.35.21:50408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:51.480870 kernel: audit: type=1131 audit(1768352931.471:821): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.21.32:22-5.187.35.21:50408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:51.499000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.21.32:22-5.187.35.21:50452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:51.500363 systemd[1]: Started sshd@30-10.0.21.32:22-5.187.35.21:50452.service - OpenSSH per-connection server daemon (5.187.35.21:50452). Jan 14 01:08:51.505611 kernel: audit: type=1130 audit(1768352931.499:822): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.21.32:22-5.187.35.21:50452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:08:53.857562 kubelet[3316]: E0114 01:08:53.857402 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:08:53.859166 kubelet[3316]: E0114 01:08:53.859110 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:08:54.339124 sshd[5817]: Connection closed by authenticating user root 5.187.35.21 port 50452 [preauth] Jan 14 01:08:54.338000 audit[5817]: USER_ERR pid=5817 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:54.342523 systemd[1]: sshd@30-10.0.21.32:22-5.187.35.21:50452.service: Deactivated successfully. Jan 14 01:08:54.344088 kernel: audit: type=1109 audit(1768352934.338:823): pid=5817 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:54.341000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.21.32:22-5.187.35.21:50452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:54.351060 kernel: audit: type=1131 audit(1768352934.341:824): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.21.32:22-5.187.35.21:50452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:54.367000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.21.32:22-5.187.35.21:19710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:54.368290 systemd[1]: Started sshd@31-10.0.21.32:22-5.187.35.21:19710.service - OpenSSH per-connection server daemon (5.187.35.21:19710). Jan 14 01:08:54.372144 kernel: audit: type=1130 audit(1768352934.367:825): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.21.32:22-5.187.35.21:19710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:57.047452 sshd[5825]: Connection closed by authenticating user root 5.187.35.21 port 19710 [preauth] Jan 14 01:08:57.046000 audit[5825]: USER_ERR pid=5825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:57.052116 kernel: audit: type=1109 audit(1768352937.046:826): pid=5825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:08:57.052314 systemd[1]: sshd@31-10.0.21.32:22-5.187.35.21:19710.service: Deactivated successfully. Jan 14 01:08:57.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.21.32:22-5.187.35.21:19710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:57.057076 kernel: audit: type=1131 audit(1768352937.051:827): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.21.32:22-5.187.35.21:19710 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:57.071686 systemd[1]: Started sshd@32-10.0.21.32:22-5.187.35.21:19732.service - OpenSSH per-connection server daemon (5.187.35.21:19732). Jan 14 01:08:57.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.21.32:22-5.187.35.21:19732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:57.076176 kernel: audit: type=1130 audit(1768352937.070:828): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.21.32:22-5.187.35.21:19732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:08:59.307760 sshd[5831]: Invalid user admin from 5.187.35.21 port 19732 Jan 14 01:09:00.286158 sshd[5831]: Connection closed by invalid user admin 5.187.35.21 port 19732 [preauth] Jan 14 01:09:00.285000 audit[5831]: USER_ERR pid=5831 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:09:00.288392 systemd[1]: sshd@32-10.0.21.32:22-5.187.35.21:19732.service: Deactivated successfully. Jan 14 01:09:00.291210 kernel: audit: type=1109 audit(1768352940.285:829): pid=5831 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:09:00.291263 kernel: audit: type=1131 audit(1768352940.288:830): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.21.32:22-5.187.35.21:19732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:09:00.288000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.21.32:22-5.187.35.21:19732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:09:00.320584 systemd[1]: Started sshd@33-10.0.21.32:22-5.187.35.21:19760.service - OpenSSH per-connection server daemon (5.187.35.21:19760). 
Jan 14 01:09:00.319000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.21.32:22-5.187.35.21:19760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:09:00.325086 kernel: audit: type=1130 audit(1768352940.319:831): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.21.32:22-5.187.35.21:19760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:09:00.480738 containerd[1695]: time="2026-01-14T01:09:00.480691550Z" level=info msg="container event discarded" container=337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19 type=CONTAINER_CREATED_EVENT Jan 14 01:09:00.481412 containerd[1695]: time="2026-01-14T01:09:00.481269408Z" level=info msg="container event discarded" container=337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19 type=CONTAINER_STARTED_EVENT Jan 14 01:09:00.481412 containerd[1695]: time="2026-01-14T01:09:00.481286737Z" level=info msg="container event discarded" container=489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46 type=CONTAINER_CREATED_EVENT Jan 14 01:09:00.481412 containerd[1695]: time="2026-01-14T01:09:00.481294544Z" level=info msg="container event discarded" container=489a965a8afd48451e1fbe8bf6c775b11a4c5eda5ffd2ae111d0c66a7030ea46 type=CONTAINER_STARTED_EVENT Jan 14 01:09:00.512503 containerd[1695]: time="2026-01-14T01:09:00.512445103Z" level=info msg="container event discarded" container=5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136 type=CONTAINER_CREATED_EVENT Jan 14 01:09:00.610881 containerd[1695]: time="2026-01-14T01:09:00.610834760Z" level=info msg="container event discarded" container=5a874574a1e8700d55742af43090a7d89478dcf2578a6370f44842ab6baa1136 type=CONTAINER_STARTED_EVENT Jan 14 01:09:00.856183 kubelet[3316]: E0114 01:09:00.856120 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:09:02.702924 sshd[5837]: Invalid user baikal from 5.187.35.21 port 19760 Jan 14 01:09:02.859105 kubelet[3316]: E0114 01:09:02.857700 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:09:02.859589 kubelet[3316]: 
E0114 01:09:02.859542 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:09:02.894563 containerd[1695]: time="2026-01-14T01:09:02.894505450Z" level=info msg="container event discarded" container=2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc type=CONTAINER_CREATED_EVENT Jan 14 01:09:02.939715 containerd[1695]: time="2026-01-14T01:09:02.939653624Z" level=info msg="container event discarded" container=2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc type=CONTAINER_STARTED_EVENT Jan 14 01:09:03.146771 sshd[5837]: Connection closed by invalid user baikal 5.187.35.21 port 19760 [preauth] Jan 14 01:09:03.146000 audit[5837]: USER_ERR pid=5837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:09:03.150376 systemd[1]: sshd@33-10.0.21.32:22-5.187.35.21:19760.service: Deactivated successfully. Jan 14 01:09:03.149000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.21.32:22-5.187.35.21:19760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:09:03.152220 kernel: audit: type=1109 audit(1768352943.146:832): pid=5837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=5.187.35.21 addr=5.187.35.21 terminal=ssh res=failed' Jan 14 01:09:03.152280 kernel: audit: type=1131 audit(1768352943.149:833): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.21.32:22-5.187.35.21:19760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:09:03.855918 kubelet[3316]: E0114 01:09:03.855870 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:09:04.855852 kubelet[3316]: E0114 01:09:04.855572 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:09:05.857303 kubelet[3316]: E0114 01:09:05.857273 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:09:13.151430 containerd[1695]: time="2026-01-14T01:09:13.151354197Z" level=info msg="container event discarded" container=8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab type=CONTAINER_CREATED_EVENT Jan 14 01:09:13.151895 containerd[1695]: time="2026-01-14T01:09:13.151868271Z" level=info msg="container event discarded" container=8f8b8520b795e6072f24022aa942189bc4b11d9b8b262760089244acd6c6d6ab type=CONTAINER_STARTED_EVENT Jan 14 01:09:13.266114 containerd[1695]: time="2026-01-14T01:09:13.265999813Z" level=info msg="container event discarded" container=b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431 type=CONTAINER_CREATED_EVENT Jan 14 01:09:13.266114 containerd[1695]: time="2026-01-14T01:09:13.266101173Z" level=info msg="container event discarded" container=b08cd74b7cd1e5e9af53f88d44fdfd4a375fd24d3f7460322c2d5f450bf0a431 type=CONTAINER_STARTED_EVENT Jan 14 01:09:14.856246 kubelet[3316]: E0114 01:09:14.856162 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:09:14.856919 kubelet[3316]: E0114 01:09:14.856729 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:09:15.599286 containerd[1695]: time="2026-01-14T01:09:15.599234570Z" level=info msg="container event discarded" container=02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f type=CONTAINER_CREATED_EVENT Jan 14 01:09:15.684031 containerd[1695]: time="2026-01-14T01:09:15.683981187Z" level=info msg="container event discarded" container=02cf34fad0271e8b0ed026836983bd5651d45ec0470110f4e174171b38e1308f type=CONTAINER_STARTED_EVENT Jan 14 01:09:15.856502 kubelet[3316]: E0114 01:09:15.856414 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:09:15.857112 kubelet[3316]: E0114 01:09:15.856820 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:09:17.119446 containerd[1695]: time="2026-01-14T01:09:17.119391732Z" level=info msg="container event discarded" container=13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321 type=CONTAINER_CREATED_EVENT Jan 14 01:09:17.214655 containerd[1695]: time="2026-01-14T01:09:17.214593898Z" level=info msg="container event discarded" container=13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321 type=CONTAINER_STARTED_EVENT Jan 14 01:09:17.861073 kubelet[3316]: E0114 01:09:17.860038 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:09:17.861790 kubelet[3316]: E0114 01:09:17.861726 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:09:18.783393 containerd[1695]: time="2026-01-14T01:09:18.783238049Z" level=info msg="container event discarded" container=13ae228fc04e2b181aa1d18a958926e1f90923713d9d416649ba90c146095321 type=CONTAINER_STOPPED_EVENT Jan 14 01:09:19.834700 systemd[2265]: Created slice background.slice - User Background Tasks Slice. Jan 14 01:09:19.836495 systemd[2265]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 14 01:09:19.857845 systemd[2265]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. Jan 14 01:09:22.789755 containerd[1695]: time="2026-01-14T01:09:22.789694131Z" level=info msg="container event discarded" container=c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a type=CONTAINER_CREATED_EVENT Jan 14 01:09:22.910140 containerd[1695]: time="2026-01-14T01:09:22.910088968Z" level=info msg="container event discarded" container=c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a type=CONTAINER_STARTED_EVENT Jan 14 01:09:25.741133 containerd[1695]: time="2026-01-14T01:09:25.741085135Z" level=info msg="container event discarded" container=c4a5b809b26841c945c90094d5df088b3b7f98b6736cdbc81c7840d4cc9cc38a type=CONTAINER_STOPPED_EVENT Jan 14 01:09:26.855673 kubelet[3316]: E0114 01:09:26.855499 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:09:27.856170 kubelet[3316]: E0114 01:09:27.856040 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" 
podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:09:29.856508 kubelet[3316]: E0114 01:09:29.856453 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:09:29.856508 kubelet[3316]: E0114 01:09:29.856462 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:09:29.857290 kubelet[3316]: E0114 01:09:29.856981 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:09:32.855884 kubelet[3316]: E0114 01:09:32.855798 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:09:33.782345 containerd[1695]: time="2026-01-14T01:09:33.782285978Z" level=info msg="container event discarded" container=184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968 type=CONTAINER_CREATED_EVENT Jan 14 01:09:33.922788 containerd[1695]: time="2026-01-14T01:09:33.922724154Z" level=info msg="container event discarded" container=184f17c5c82331a1ebb3e9ae10d0457db589802799c52461d032e47c08b25968 type=CONTAINER_STARTED_EVENT Jan 14 01:09:36.624623 containerd[1695]: time="2026-01-14T01:09:36.624571224Z" level=info msg="container event discarded" container=20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21 type=CONTAINER_CREATED_EVENT Jan 14 01:09:36.625107 containerd[1695]: time="2026-01-14T01:09:36.624972329Z" level=info msg="container 
event discarded" container=20bbac06aaaf81fccf636c1eaf2ed53d8786ed39b748ce5a6c809a0d07f74f21 type=CONTAINER_STARTED_EVENT Jan 14 01:09:37.091839 containerd[1695]: time="2026-01-14T01:09:37.091787607Z" level=info msg="container event discarded" container=ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b type=CONTAINER_CREATED_EVENT Jan 14 01:09:37.091839 containerd[1695]: time="2026-01-14T01:09:37.091832090Z" level=info msg="container event discarded" container=ef08dc1e97cf380a7b172e3df2567cf6aeabb782f1eb83e29e1c08a579d4ae0b type=CONTAINER_STARTED_EVENT Jan 14 01:09:38.105199 containerd[1695]: time="2026-01-14T01:09:38.105143597Z" level=info msg="container event discarded" container=3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba type=CONTAINER_CREATED_EVENT Jan 14 01:09:38.105637 containerd[1695]: time="2026-01-14T01:09:38.105182792Z" level=info msg="container event discarded" container=3bba22cc51f68d20b1a3cc26c10df42d95737bbc11f0cedc06c46e6c2da9d3ba type=CONTAINER_STARTED_EVENT Jan 14 01:09:38.221925 containerd[1695]: time="2026-01-14T01:09:38.221858889Z" level=info msg="container event discarded" container=fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9 type=CONTAINER_CREATED_EVENT Jan 14 01:09:38.222146 containerd[1695]: time="2026-01-14T01:09:38.222124096Z" level=info msg="container event discarded" container=fb2000b71b00dc42be88efcddee2b9cc3d02a3e37b7619872ab05122c2f67dc9 type=CONTAINER_STARTED_EVENT Jan 14 01:09:38.261246 containerd[1695]: time="2026-01-14T01:09:38.261190921Z" level=info msg="container event discarded" container=37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd type=CONTAINER_CREATED_EVENT Jan 14 01:09:38.301656 containerd[1695]: time="2026-01-14T01:09:38.301601200Z" level=info msg="container event discarded" container=37ea5ba7f26074172f69479388d79b76fce8ac916e6b72b62dc5d43aea2a04dd type=CONTAINER_STARTED_EVENT Jan 14 01:09:38.690505 systemd[1]: cri-containerd-45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8.scope: Deactivated successfully. Jan 14 01:09:38.691967 systemd[1]: cri-containerd-45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8.scope: Consumed 3.525s CPU time, 61.6M memory peak, 84K read from disk. Jan 14 01:09:38.691000 audit: BPF prog-id=260 op=LOAD Jan 14 01:09:38.695112 kernel: audit: type=1334 audit(1768352978.691:834): prog-id=260 op=LOAD Jan 14 01:09:38.695271 containerd[1695]: time="2026-01-14T01:09:38.695243160Z" level=info msg="received container exit event container_id:\"45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8\" id:\"45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8\" pid:3162 exit_status:1 exited_at:{seconds:1768352978 nanos:691551983}" Jan 14 01:09:38.691000 audit: BPF prog-id=97 op=UNLOAD Jan 14 01:09:38.697059 kernel: audit: type=1334 audit(1768352978.691:835): prog-id=97 op=UNLOAD Jan 14 01:09:38.697000 audit: BPF prog-id=107 op=UNLOAD Jan 14 01:09:38.697000 audit: BPF prog-id=111 op=UNLOAD Jan 14 01:09:38.700081 kernel: audit: type=1334 audit(1768352978.697:836): prog-id=107 op=UNLOAD Jan 14 01:09:38.700115 kernel: audit: type=1334 audit(1768352978.697:837): prog-id=111 op=UNLOAD Jan 14 01:09:38.719231 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8-rootfs.mount: Deactivated successfully. 
Jan 14 01:09:38.900974 kubelet[3316]: E0114 01:09:38.900943 3316 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.32:58360->10.0.21.96:2379: read: connection timed out" Jan 14 01:09:39.668650 systemd[1]: cri-containerd-2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc.scope: Deactivated successfully. Jan 14 01:09:39.669448 systemd[1]: cri-containerd-2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc.scope: Consumed 39.488s CPU time, 114.1M memory peak. Jan 14 01:09:39.671609 containerd[1695]: time="2026-01-14T01:09:39.671547055Z" level=info msg="received container exit event container_id:\"2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc\" id:\"2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc\" pid:3637 exit_status:1 exited_at:{seconds:1768352979 nanos:671009677}" Jan 14 01:09:39.671000 audit: BPF prog-id=150 op=UNLOAD Jan 14 01:09:39.671000 audit: BPF prog-id=154 op=UNLOAD Jan 14 01:09:39.674458 kernel: audit: type=1334 audit(1768352979.671:838): prog-id=150 op=UNLOAD Jan 14 01:09:39.674499 kernel: audit: type=1334 audit(1768352979.671:839): prog-id=154 op=UNLOAD Jan 14 01:09:39.692334 kubelet[3316]: I0114 01:09:39.692312 3316 scope.go:117] "RemoveContainer" containerID="45f137af4154ccd9fc04c85cc2a517e9d57dbc0cfe10389b07cb49e6575117a8" Jan 14 01:09:39.694555 containerd[1695]: time="2026-01-14T01:09:39.694525943Z" level=info msg="CreateContainer within sandbox \"697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 14 01:09:39.701329 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc-rootfs.mount: Deactivated successfully. Jan 14 01:09:39.716584 containerd[1695]: time="2026-01-14T01:09:39.716551064Z" level=info msg="Container d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:09:39.733280 containerd[1695]: time="2026-01-14T01:09:39.733179183Z" level=info msg="CreateContainer within sandbox \"697c8fa0d6cd8e514cc5b789d61ed016a9cbdccfcde7e58f45d343a0305755ee\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583\"" Jan 14 01:09:39.733648 containerd[1695]: time="2026-01-14T01:09:39.733630619Z" level=info msg="StartContainer for \"d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583\"" Jan 14 01:09:39.734604 containerd[1695]: time="2026-01-14T01:09:39.734530810Z" level=info msg="connecting to shim d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583" address="unix:///run/containerd/s/8a18ef3b0d6599cb82c81ae7085239f9a2bf226ef3c22999d5a97c6fd8ac016e" protocol=ttrpc version=3 Jan 14 01:09:39.756265 systemd[1]: Started cri-containerd-d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583.scope - libcontainer container d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583. 
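The run above — the cri-containerd scopes for 45f137af… and 2b216497… deactivating, containerd reporting "received container exit event … exit_status:1" for each, and the kubelet then doing RemoveContainer / CreateContainer / StartContainer for kube-controller-manager (Attempt:1) inside its existing sandbox — is the ordinary crash-replacement path; the equivalent replacement for the tigera-operator container follows just below. As a sketch in the same spirit (hypothetical node.log again), the exit events and their status codes can be listed with:

import re

LOG_PATH = "node.log"  # hypothetical path holding this journal text
EXIT_RE = re.compile(r'received container exit event container_id:\\"([0-9a-f]+)\\".*?exit_status:(\d+)')

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        for container_id, status in EXIT_RE.findall(line):
            print(f"{container_id[:12]}  exit_status={status}")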
Jan 14 01:09:39.768000 audit: BPF prog-id=261 op=LOAD Jan 14 01:09:39.768000 audit: BPF prog-id=262 op=LOAD Jan 14 01:09:39.772137 kernel: audit: type=1334 audit(1768352979.768:840): prog-id=261 op=LOAD Jan 14 01:09:39.772192 kernel: audit: type=1334 audit(1768352979.768:841): prog-id=262 op=LOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.778299 kernel: audit: type=1300 audit(1768352979.768:841): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=262 op=UNLOAD Jan 14 01:09:39.783110 kernel: audit: type=1327 audit(1768352979.768:841): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=263 op=LOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=264 op=LOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=264 op=UNLOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=263 op=UNLOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.768000 audit: BPF prog-id=265 op=LOAD Jan 14 01:09:39.768000 audit[5919]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3035 pid=5919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:39.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333861343037333665386537626133373565303434303037333733 Jan 14 01:09:39.811784 containerd[1695]: time="2026-01-14T01:09:39.811753560Z" level=info msg="StartContainer for \"d738a40736e8e7ba375e0440073735ee25f35256c47cffdcc96c7beebaf87583\" returns successfully" Jan 14 01:09:39.859747 kubelet[3316]: E0114 01:09:39.859710 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-tk96l" podUID="f26be41d-3305-4b21-9d76-bde121cc2cce" Jan 14 01:09:39.860065 kubelet[3316]: E0114 01:09:39.860022 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d94f85b8d-frfrp" podUID="51002e26-f95b-49f1-8f48-be4a381935eb" Jan 14 01:09:40.137723 containerd[1695]: time="2026-01-14T01:09:40.137665581Z" level=info msg="container event discarded" container=26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718 type=CONTAINER_CREATED_EVENT Jan 14 01:09:40.137723 containerd[1695]: time="2026-01-14T01:09:40.137715427Z" level=info msg="container event discarded" container=26a9c1bb5fb0ccf136e278e6817ec21be039038d97053415ccdba63eab599718 type=CONTAINER_STARTED_EVENT Jan 14 01:09:40.176972 containerd[1695]: time="2026-01-14T01:09:40.176909902Z" level=info msg="container event discarded" container=20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074 type=CONTAINER_CREATED_EVENT Jan 14 01:09:40.243199 containerd[1695]: time="2026-01-14T01:09:40.243143749Z" level=info msg="container event discarded" container=20b1df70efe84c6f53d181488017682733d715e52904017e5dc38cce81ee1074 type=CONTAINER_STARTED_EVENT Jan 14 01:09:40.273426 containerd[1695]: time="2026-01-14T01:09:40.273383283Z" level=info msg="container event discarded" container=9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e type=CONTAINER_CREATED_EVENT Jan 14 01:09:40.273426 containerd[1695]: time="2026-01-14T01:09:40.273415223Z" level=info msg="container event discarded" container=9efcddebf2bffb15fb476b3192bfdddd92c2af61191b4d5a0ca3ee761d77d10e type=CONTAINER_STARTED_EVENT Jan 14 01:09:40.695457 kubelet[3316]: I0114 01:09:40.695286 3316 scope.go:117] "RemoveContainer" containerID="2b216497486f9bd0d6a49a0af7a082b924c0ad66cf99ae5b86d17b3bb20e65cc" Jan 14 01:09:40.698118 containerd[1695]: time="2026-01-14T01:09:40.697257234Z" level=info msg="CreateContainer within sandbox \"337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 14 01:09:40.710938 containerd[1695]: time="2026-01-14T01:09:40.710908899Z" level=info msg="Container 42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:09:40.717818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3961968558.mount: Deactivated successfully. 
Jan 14 01:09:40.721881 containerd[1695]: time="2026-01-14T01:09:40.721239700Z" level=info msg="CreateContainer within sandbox \"337f729c27e19af09305fe11891301cb8a04864a6d943a017f216e4e4aa75f19\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320\"" Jan 14 01:09:40.721881 containerd[1695]: time="2026-01-14T01:09:40.721649601Z" level=info msg="StartContainer for \"42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320\"" Jan 14 01:09:40.722670 containerd[1695]: time="2026-01-14T01:09:40.722637764Z" level=info msg="connecting to shim 42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320" address="unix:///run/containerd/s/d2f20eec08247b32b799b7380dd6c7427615b3320d2cba5f839f7f36d4c46bf4" protocol=ttrpc version=3 Jan 14 01:09:40.745230 systemd[1]: Started cri-containerd-42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320.scope - libcontainer container 42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320. Jan 14 01:09:40.755000 audit: BPF prog-id=266 op=LOAD Jan 14 01:09:40.755000 audit: BPF prog-id=267 op=LOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=267 op=UNLOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=268 op=LOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=269 op=LOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=269 op=UNLOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=268 op=UNLOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.755000 audit: BPF prog-id=270 op=LOAD Jan 14 01:09:40.755000 audit[5951]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=3373 pid=5951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:40.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432646564656465316435313938656138393264343635616266646262 Jan 14 01:09:40.781162 containerd[1695]: time="2026-01-14T01:09:40.781088157Z" level=info msg="StartContainer for \"42dedede1d5198ea892d465abfdbbc9ea8eb516d870ee6a516bd1d1e86f32320\" returns successfully" Jan 14 01:09:40.856069 kubelet[3316]: E0114 01:09:40.855566 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-694bbc95d6-42t9l" podUID="b7d50268-8797-458d-a912-f7456846c1f2" Jan 14 01:09:40.856389 kubelet[3316]: E0114 01:09:40.856366 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gdh9l" podUID="4395ad87-346f-47f3-8e06-f63944f13a5d" Jan 14 01:09:41.205571 containerd[1695]: time="2026-01-14T01:09:41.205509717Z" level=info msg="container event discarded" container=4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4 type=CONTAINER_CREATED_EVENT Jan 14 01:09:41.205571 containerd[1695]: time="2026-01-14T01:09:41.205546187Z" level=info msg="container event discarded" container=4abdad7e18e2b0190c1029b0667456e2ad190a173aa2e4690636e761c1cd40a4 type=CONTAINER_STARTED_EVENT Jan 14 01:09:41.299908 containerd[1695]: time="2026-01-14T01:09:41.299839848Z" level=info msg="container event discarded" container=e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3 type=CONTAINER_CREATED_EVENT Jan 14 01:09:41.299908 containerd[1695]: time="2026-01-14T01:09:41.299886319Z" level=info msg="container event discarded" container=e2c2355c20ab5fcd06673247dc43026ffc5247ef74592f46e90759e51278cea3 type=CONTAINER_STARTED_EVENT Jan 14 01:09:41.857311 kubelet[3316]: E0114 01:09:41.857268 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-vc7sq" podUID="95ab78ae-ca97-4cab-9490-03b0a50f740c" Jan 14 01:09:43.502668 systemd[1]: cri-containerd-2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f.scope: Deactivated successfully. Jan 14 01:09:43.502950 systemd[1]: cri-containerd-2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f.scope: Consumed 2.402s CPU time, 24.4M memory peak. Jan 14 01:09:43.502000 audit: BPF prog-id=271 op=LOAD Jan 14 01:09:43.502000 audit: BPF prog-id=92 op=UNLOAD Jan 14 01:09:43.505216 containerd[1695]: time="2026-01-14T01:09:43.505038697Z" level=info msg="received container exit event container_id:\"2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f\" id:\"2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f\" pid:3163 exit_status:1 exited_at:{seconds:1768352983 nanos:504635550}" Jan 14 01:09:43.506000 audit: BPF prog-id=112 op=UNLOAD Jan 14 01:09:43.506000 audit: BPF prog-id=116 op=UNLOAD Jan 14 01:09:43.526167 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f-rootfs.mount: Deactivated successfully. 
Jan 14 01:09:43.708796 kubelet[3316]: I0114 01:09:43.708735 3316 scope.go:117] "RemoveContainer" containerID="2cf0ce44d3607542dcd9098ffb524edc9046a98eb65edf65a62064d187fc264f" Jan 14 01:09:43.711683 containerd[1695]: time="2026-01-14T01:09:43.711224189Z" level=info msg="CreateContainer within sandbox \"486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 14 01:09:43.726253 containerd[1695]: time="2026-01-14T01:09:43.725602299Z" level=info msg="Container 55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:09:43.735126 containerd[1695]: time="2026-01-14T01:09:43.735101278Z" level=info msg="CreateContainer within sandbox \"486f6e84de8abf16564b2e9ece5a1a1fcf44827b575bce934706ac75e82f4740\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e\"" Jan 14 01:09:43.735632 containerd[1695]: time="2026-01-14T01:09:43.735611411Z" level=info msg="StartContainer for \"55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e\"" Jan 14 01:09:43.736502 containerd[1695]: time="2026-01-14T01:09:43.736482684Z" level=info msg="connecting to shim 55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e" address="unix:///run/containerd/s/dc4304f06e6fba422e37736d7f49e28eb67a61c4050768112b4d4b4bd5abc2ca" protocol=ttrpc version=3 Jan 14 01:09:43.758282 systemd[1]: Started cri-containerd-55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e.scope - libcontainer container 55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e. Jan 14 01:09:43.769000 audit: BPF prog-id=272 op=LOAD Jan 14 01:09:43.771551 kernel: kauditd_printk_skb: 44 callbacks suppressed Jan 14 01:09:43.771602 kernel: audit: type=1334 audit(1768352983.769:860): prog-id=272 op=LOAD Jan 14 01:09:43.772000 audit: BPF prog-id=273 op=LOAD Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.775913 kernel: audit: type=1334 audit(1768352983.772:861): prog-id=273 op=LOAD Jan 14 01:09:43.775956 kernel: audit: type=1300 audit(1768352983.772:861): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.779888 kernel: audit: type=1327 audit(1768352983.772:861): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=273 op=UNLOAD Jan 14 01:09:43.783439 kernel: audit: type=1334 audit(1768352983.772:862): prog-id=273 op=UNLOAD 
Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.784969 kernel: audit: type=1300 audit(1768352983.772:862): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.789203 kernel: audit: type=1327 audit(1768352983.772:862): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=274 op=LOAD Jan 14 01:09:43.792391 kernel: audit: type=1334 audit(1768352983.772:863): prog-id=274 op=LOAD Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.797070 kernel: audit: type=1300 audit(1768352983.772:863): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.797135 kernel: audit: type=1327 audit(1768352983.772:863): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=275 op=LOAD Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=275 op=UNLOAD Jan 14 01:09:43.772000 audit[6020]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=274 op=UNLOAD Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.772000 audit: BPF prog-id=276 op=LOAD Jan 14 01:09:43.772000 audit[6020]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2989 pid=6020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:09:43.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535333837663436326562333562393732326564353331653937393634 Jan 14 01:09:43.817302 containerd[1695]: time="2026-01-14T01:09:43.817272498Z" level=info msg="StartContainer for \"55387f462eb35b9722ed531e9796436335c709e9402aed4789a9d3cbe7d5978e\" returns successfully" Jan 14 01:09:46.856216 kubelet[3316]: E0114 01:09:46.856177 3316 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-86656bdf75-c9kjr" podUID="81b57aeb-9645-4cab-a7a2-931a98fd5ce6" Jan 14 01:09:48.902722 kubelet[3316]: E0114 01:09:48.901476 3316 controller.go:195] "Failed to update lease" err="Put \"https://10.0.21.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-de0c74fc75?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"