Jan 15 01:14:33.027461 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 14 22:02:13 -00 2026 Jan 15 01:14:33.027491 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:14:33.027501 kernel: BIOS-provided physical RAM map: Jan 15 01:14:33.027508 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 15 01:14:33.027514 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 15 01:14:33.027520 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 15 01:14:33.027530 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Jan 15 01:14:33.027536 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 15 01:14:33.027542 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 15 01:14:33.027549 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 15 01:14:33.027555 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable Jan 15 01:14:33.027561 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 15 01:14:33.027568 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 15 01:14:33.027574 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 15 01:14:33.027584 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 15 01:14:33.027590 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 15 01:14:33.027597 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 15 01:14:33.027604 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 15 01:14:33.027610 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 15 01:14:33.027617 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 15 01:14:33.027625 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 15 01:14:33.027632 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 15 01:14:33.027638 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 15 01:14:33.027645 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 15 01:14:33.027651 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 15 01:14:33.027658 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 15 01:14:33.027665 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 15 01:14:33.027671 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 15 01:14:33.027678 kernel: NX (Execute Disable) protection: active Jan 15 01:14:33.027685 kernel: APIC: Static calls initialized Jan 15 01:14:33.027691 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable Jan 15 01:14:33.027700 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable Jan 15 01:14:33.027707 kernel: extended physical RAM map: Jan 15 01:14:33.027713 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 15 
01:14:33.027720 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 15 01:14:33.027727 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 15 01:14:33.027733 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Jan 15 01:14:33.027740 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 15 01:14:33.027746 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 15 01:14:33.027753 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 15 01:14:33.027765 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable Jan 15 01:14:33.027772 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable Jan 15 01:14:33.027779 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable Jan 15 01:14:33.027786 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable Jan 15 01:14:33.027794 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable Jan 15 01:14:33.027801 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 15 01:14:33.027808 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 15 01:14:33.027815 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 15 01:14:33.027822 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 15 01:14:33.027829 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 15 01:14:33.027837 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 15 01:14:33.027844 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 15 01:14:33.027851 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 15 01:14:33.027858 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 15 01:14:33.027865 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 15 01:14:33.027873 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 15 01:14:33.027880 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 15 01:14:33.027887 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 15 01:14:33.027894 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 15 01:14:33.027901 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 15 01:14:33.027908 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 15 01:14:33.027915 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 15 01:14:33.027922 kernel: efi: EFI v2.7 by EDK II Jan 15 01:14:33.027929 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018 Jan 15 01:14:33.027936 kernel: random: crng init done Jan 15 01:14:33.027943 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 15 01:14:33.027952 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 15 01:14:33.027959 kernel: secureboot: Secure boot disabled Jan 15 01:14:33.027966 kernel: SMBIOS 2.8 present. 
Jan 15 01:14:33.027973 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Jan 15 01:14:33.027980 kernel: DMI: Memory slots populated: 1/1 Jan 15 01:14:33.027987 kernel: Hypervisor detected: KVM Jan 15 01:14:33.029938 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 15 01:14:33.029953 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 15 01:14:33.029963 kernel: kvm-clock: using sched offset of 5271862653 cycles Jan 15 01:14:33.029972 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 15 01:14:33.029984 kernel: tsc: Detected 2294.608 MHz processor Jan 15 01:14:33.029992 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 15 01:14:33.030000 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 15 01:14:33.030008 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000 Jan 15 01:14:33.030015 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 15 01:14:33.030023 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 15 01:14:33.030031 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 15 01:14:33.030038 kernel: Using GB pages for direct mapping Jan 15 01:14:33.030048 kernel: ACPI: Early table checksum verification disabled Jan 15 01:14:33.030055 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 15 01:14:33.030063 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Jan 15 01:14:33.030071 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:14:33.030078 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:14:33.030086 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 15 01:14:33.030093 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:14:33.030102 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:14:33.030110 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:14:33.030117 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 15 01:14:33.030125 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3] Jan 15 01:14:33.030132 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b] Jan 15 01:14:33.030140 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 15 01:14:33.030148 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f] Jan 15 01:14:33.030157 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b] Jan 15 01:14:33.030164 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027] Jan 15 01:14:33.030171 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037] Jan 15 01:14:33.030179 kernel: No NUMA configuration found Jan 15 01:14:33.030186 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff] Jan 15 01:14:33.030194 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff] Jan 15 01:14:33.030202 kernel: Zone ranges: Jan 15 01:14:33.030210 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 15 01:14:33.030219 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 15 01:14:33.030227 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff] Jan 15 01:14:33.030234 kernel: Device empty Jan 15 01:14:33.030242 kernel: Movable zone start for each node Jan 
15 01:14:33.030249 kernel: Early memory node ranges Jan 15 01:14:33.030257 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 15 01:14:33.030264 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Jan 15 01:14:33.030273 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Jan 15 01:14:33.030280 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Jan 15 01:14:33.030288 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff] Jan 15 01:14:33.030305 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff] Jan 15 01:14:33.030313 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff] Jan 15 01:14:33.030327 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff] Jan 15 01:14:33.030337 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff] Jan 15 01:14:33.030345 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff] Jan 15 01:14:33.030353 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff] Jan 15 01:14:33.030361 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 15 01:14:33.030371 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 15 01:14:33.030379 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Jan 15 01:14:33.030388 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 15 01:14:33.030396 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Jan 15 01:14:33.030406 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 15 01:14:33.030414 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Jan 15 01:14:33.030422 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 15 01:14:33.030430 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Jan 15 01:14:33.030438 kernel: On node 0, zone Normal: 276 pages in unavailable ranges Jan 15 01:14:33.030446 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 15 01:14:33.030455 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 15 01:14:33.030465 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 15 01:14:33.030473 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 15 01:14:33.030481 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 15 01:14:33.030489 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 15 01:14:33.030497 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 15 01:14:33.030505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jan 15 01:14:33.030513 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 15 01:14:33.030523 kernel: TSC deadline timer available Jan 15 01:14:33.030531 kernel: CPU topo: Max. logical packages: 2 Jan 15 01:14:33.030539 kernel: CPU topo: Max. logical dies: 2 Jan 15 01:14:33.030547 kernel: CPU topo: Max. dies per package: 1 Jan 15 01:14:33.030556 kernel: CPU topo: Max. threads per core: 1 Jan 15 01:14:33.030564 kernel: CPU topo: Num. cores per package: 1 Jan 15 01:14:33.030572 kernel: CPU topo: Num. 
threads per package: 1 Jan 15 01:14:33.030580 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 15 01:14:33.030589 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 15 01:14:33.030597 kernel: kvm-guest: KVM setup pv remote TLB flush Jan 15 01:14:33.030606 kernel: kvm-guest: setup PV sched yield Jan 15 01:14:33.030614 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 15 01:14:33.030623 kernel: Booting paravirtualized kernel on KVM Jan 15 01:14:33.030631 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 15 01:14:33.030639 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 15 01:14:33.030649 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 15 01:14:33.030657 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 15 01:14:33.030665 kernel: pcpu-alloc: [0] 0 1 Jan 15 01:14:33.030673 kernel: kvm-guest: PV spinlocks enabled Jan 15 01:14:33.030681 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 15 01:14:33.030690 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:14:33.030699 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 15 01:14:33.030709 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 15 01:14:33.030717 kernel: Fallback order for Node 0: 0 Jan 15 01:14:33.030725 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694 Jan 15 01:14:33.030734 kernel: Policy zone: Normal Jan 15 01:14:33.030742 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 15 01:14:33.030750 kernel: software IO TLB: area num 2. Jan 15 01:14:33.030758 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 15 01:14:33.030768 kernel: ftrace: allocating 40097 entries in 157 pages Jan 15 01:14:33.030776 kernel: ftrace: allocated 157 pages with 5 groups Jan 15 01:14:33.030784 kernel: Dynamic Preempt: voluntary Jan 15 01:14:33.030792 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 15 01:14:33.030801 kernel: rcu: RCU event tracing is enabled. Jan 15 01:14:33.030809 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 15 01:14:33.030817 kernel: Trampoline variant of Tasks RCU enabled. Jan 15 01:14:33.030827 kernel: Rude variant of Tasks RCU enabled. Jan 15 01:14:33.030835 kernel: Tracing variant of Tasks RCU enabled. Jan 15 01:14:33.030843 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 15 01:14:33.030851 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 15 01:14:33.030859 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 15 01:14:33.030868 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 15 01:14:33.030876 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 15 01:14:33.030884 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 15 01:14:33.030893 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 15 01:14:33.030901 kernel: Console: colour dummy device 80x25 Jan 15 01:14:33.030909 kernel: printk: legacy console [tty0] enabled Jan 15 01:14:33.030917 kernel: printk: legacy console [ttyS0] enabled Jan 15 01:14:33.030925 kernel: ACPI: Core revision 20240827 Jan 15 01:14:33.030934 kernel: APIC: Switch to symmetric I/O mode setup Jan 15 01:14:33.030942 kernel: x2apic enabled Jan 15 01:14:33.030952 kernel: APIC: Switched APIC routing to: physical x2apic Jan 15 01:14:33.030960 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Jan 15 01:14:33.030968 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Jan 15 01:14:33.030976 kernel: kvm-guest: setup PV IPIs Jan 15 01:14:33.030985 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 15 01:14:33.030993 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608) Jan 15 01:14:33.031001 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 15 01:14:33.031011 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 15 01:14:33.031019 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 15 01:14:33.031026 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 15 01:14:33.031033 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jan 15 01:14:33.031041 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 15 01:14:33.031049 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 15 01:14:33.031056 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 15 01:14:33.031064 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 15 01:14:33.031072 kernel: TAA: Mitigation: Clear CPU buffers Jan 15 01:14:33.031079 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Jan 15 01:14:33.031088 kernel: active return thunk: its_return_thunk Jan 15 01:14:33.031096 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 15 01:14:33.031103 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 15 01:14:33.031111 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 15 01:14:33.031119 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 15 01:14:33.031126 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 15 01:14:33.031134 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 15 01:14:33.031141 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 15 01:14:33.031149 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 15 01:14:33.031157 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 15 01:14:33.031166 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 15 01:14:33.031173 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 15 01:14:33.031181 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 15 01:14:33.031188 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 15 01:14:33.031196 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Jan 15 01:14:33.031203 kernel: Freeing SMP alternatives memory: 32K Jan 15 01:14:33.031211 kernel: pid_max: default: 32768 minimum: 301 Jan 15 01:14:33.031218 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 15 01:14:33.031226 kernel: landlock: Up and running. Jan 15 01:14:33.031233 kernel: SELinux: Initializing. Jan 15 01:14:33.031241 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 15 01:14:33.031250 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 15 01:14:33.031258 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Jan 15 01:14:33.031266 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Jan 15 01:14:33.031274 kernel: ... version: 2 Jan 15 01:14:33.031282 kernel: ... bit width: 48 Jan 15 01:14:33.031296 kernel: ... generic registers: 8 Jan 15 01:14:33.031305 kernel: ... value mask: 0000ffffffffffff Jan 15 01:14:33.031313 kernel: ... max period: 00007fffffffffff Jan 15 01:14:33.031323 kernel: ... fixed-purpose events: 3 Jan 15 01:14:33.031331 kernel: ... event mask: 00000007000000ff Jan 15 01:14:33.031339 kernel: signal: max sigframe size: 3632 Jan 15 01:14:33.031347 kernel: rcu: Hierarchical SRCU implementation. Jan 15 01:14:33.031355 kernel: rcu: Max phase no-delay instances is 400. Jan 15 01:14:33.031363 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 15 01:14:33.031371 kernel: smp: Bringing up secondary CPUs ... Jan 15 01:14:33.031381 kernel: smpboot: x86: Booting SMP configuration: Jan 15 01:14:33.031389 kernel: .... node #0, CPUs: #1 Jan 15 01:14:33.031397 kernel: smp: Brought up 1 node, 2 CPUs Jan 15 01:14:33.031405 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS) Jan 15 01:14:33.031414 kernel: Memory: 3971812K/4186776K available (14336K kernel code, 2445K rwdata, 29896K rodata, 15432K init, 2608K bss, 210084K reserved, 0K cma-reserved) Jan 15 01:14:33.031422 kernel: devtmpfs: initialized Jan 15 01:14:33.031430 kernel: x86/mm: Memory block size: 128MB Jan 15 01:14:33.031440 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Jan 15 01:14:33.031448 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Jan 15 01:14:33.031456 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Jan 15 01:14:33.031465 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 15 01:14:33.031473 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes) Jan 15 01:14:33.031481 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes) Jan 15 01:14:33.031489 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 15 01:14:33.031499 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 15 01:14:33.031507 kernel: pinctrl core: initialized pinctrl subsystem Jan 15 01:14:33.031515 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 15 01:14:33.031523 kernel: audit: initializing netlink subsys (disabled) Jan 15 01:14:33.031531 kernel: audit: type=2000 audit(1768439669.929:1): state=initialized audit_enabled=0 res=1 Jan 15 01:14:33.031539 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 15 01:14:33.031547 kernel: thermal_sys: Registered thermal governor 'user_space' 
Jan 15 01:14:33.031556 kernel: cpuidle: using governor menu Jan 15 01:14:33.031564 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 15 01:14:33.031572 kernel: dca service started, version 1.12.1 Jan 15 01:14:33.031580 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 15 01:14:33.031588 kernel: PCI: Using configuration type 1 for base access Jan 15 01:14:33.031596 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 15 01:14:33.031605 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 15 01:14:33.031614 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 15 01:14:33.031622 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 15 01:14:33.031630 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 15 01:14:33.031638 kernel: ACPI: Added _OSI(Module Device) Jan 15 01:14:33.031647 kernel: ACPI: Added _OSI(Processor Device) Jan 15 01:14:33.031654 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 15 01:14:33.031663 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 15 01:14:33.031672 kernel: ACPI: Interpreter enabled Jan 15 01:14:33.031680 kernel: ACPI: PM: (supports S0 S3 S5) Jan 15 01:14:33.031689 kernel: ACPI: Using IOAPIC for interrupt routing Jan 15 01:14:33.031697 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 15 01:14:33.031705 kernel: PCI: Using E820 reservations for host bridge windows Jan 15 01:14:33.031713 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 15 01:14:33.031721 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 15 01:14:33.031902 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 15 01:14:33.032010 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 15 01:14:33.032153 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 15 01:14:33.032164 kernel: PCI host bridge to bus 0000:00 Jan 15 01:14:33.032265 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 15 01:14:33.032376 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 15 01:14:33.032467 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 15 01:14:33.032554 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 15 01:14:33.032641 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 15 01:14:33.032727 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Jan 15 01:14:33.032814 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 15 01:14:33.032943 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 15 01:14:33.033054 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 15 01:14:33.033153 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 15 01:14:33.033254 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Jan 15 01:14:33.034315 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Jan 15 01:14:33.034433 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 15 01:14:33.034537 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 15 01:14:33.034644 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 
0x060400 PCIe Root Port Jan 15 01:14:33.034744 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Jan 15 01:14:33.034841 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:14:33.034939 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 15 01:14:33.035036 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 15 01:14:33.035138 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.035245 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.035360 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Jan 15 01:14:33.035457 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:14:33.035552 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 15 01:14:33.035653 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:14:33.035756 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.035854 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Jan 15 01:14:33.035949 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:14:33.036045 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 15 01:14:33.036140 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 01:14:33.036247 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.037177 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Jan 15 01:14:33.037308 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:14:33.037413 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 15 01:14:33.037511 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:14:33.037617 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.037718 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Jan 15 01:14:33.038079 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:14:33.038178 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 15 01:14:33.038274 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:14:33.038398 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.038496 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Jan 15 01:14:33.038595 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:14:33.038690 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 15 01:14:33.038786 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:14:33.038893 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.038992 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Jan 15 01:14:33.039092 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:14:33.039656 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 15 01:14:33.039759 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:14:33.039861 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.039958 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Jan 15 01:14:33.040053 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:14:33.040153 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 15 
01:14:33.040249 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:14:33.040484 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.040587 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Jan 15 01:14:33.040708 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:14:33.040861 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 15 01:14:33.040973 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:14:33.041080 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.042354 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Jan 15 01:14:33.042472 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:14:33.042571 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 15 01:14:33.042670 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:14:33.042776 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.042873 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Jan 15 01:14:33.042968 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:14:33.043063 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 15 01:14:33.043157 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:14:33.043260 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.043369 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Jan 15 01:14:33.043469 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:14:33.043565 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 15 01:14:33.043663 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:14:33.043774 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.043871 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Jan 15 01:14:33.043983 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:14:33.044081 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 15 01:14:33.044176 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:14:33.044279 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.044390 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Jan 15 01:14:33.044485 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:14:33.044579 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 15 01:14:33.044674 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:14:33.044777 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.044887 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Jan 15 01:14:33.044984 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:14:33.045097 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 15 01:14:33.045193 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:14:33.045304 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.045412 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Jan 15 01:14:33.045511 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 
01:14:33.045605 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 15 01:14:33.045699 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 01:14:33.045799 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.045895 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Jan 15 01:14:33.045989 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:14:33.046087 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 15 01:14:33.046182 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:14:33.046281 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.046394 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Jan 15 01:14:33.046490 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:14:33.046586 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 15 01:14:33.046684 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:14:33.046787 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.046883 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Jan 15 01:14:33.046979 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:14:33.047075 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 15 01:14:33.047171 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:14:33.047275 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.047391 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 15 01:14:33.047488 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:14:33.047583 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 15 01:14:33.047680 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:14:33.047785 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.047883 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 15 01:14:33.047979 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:14:33.048074 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 15 01:14:33.048171 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:14:33.048272 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.048401 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 15 01:14:33.048499 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:14:33.048595 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 15 01:14:33.048691 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:14:33.048791 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.048902 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 15 01:14:33.048998 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:14:33.049094 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 15 01:14:33.049190 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:14:33.049309 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.049413 kernel: pci 0000:00:04.7: BAR 0 [mem 
0x84386000-0x84386fff] Jan 15 01:14:33.049508 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:14:33.049605 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 15 01:14:33.049699 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:14:33.049803 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.049902 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 15 01:14:33.049997 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:14:33.050091 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 15 01:14:33.050188 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:14:33.050288 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.050395 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 15 01:14:33.050493 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 15 01:14:33.050588 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 15 01:14:33.050683 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 01:14:33.050783 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.050879 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 15 01:14:33.050974 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:14:33.051072 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 15 01:14:33.051167 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:14:33.051272 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.051408 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 15 01:14:33.051505 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:14:33.051682 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 15 01:14:33.051784 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:14:33.051884 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:14:33.051980 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 15 01:14:33.052075 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:14:33.052169 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 15 01:14:33.052264 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:14:33.052379 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 15 01:14:33.052476 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 15 01:14:33.052581 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 15 01:14:33.055457 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 15 01:14:33.055567 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 15 01:14:33.055674 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 15 01:14:33.055776 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 15 01:14:33.055881 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 15 01:14:33.055980 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 15 01:14:33.056077 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:14:33.056174 kernel: pci 
0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 15 01:14:33.056274 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 15 01:14:33.057418 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.057524 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:14:33.057635 kernel: pci_bus 0000:02: extended config space not accessible Jan 15 01:14:33.057648 kernel: acpiphp: Slot [1] registered Jan 15 01:14:33.057657 kernel: acpiphp: Slot [0] registered Jan 15 01:14:33.057666 kernel: acpiphp: Slot [2] registered Jan 15 01:14:33.057677 kernel: acpiphp: Slot [3] registered Jan 15 01:14:33.057685 kernel: acpiphp: Slot [4] registered Jan 15 01:14:33.057693 kernel: acpiphp: Slot [5] registered Jan 15 01:14:33.057702 kernel: acpiphp: Slot [6] registered Jan 15 01:14:33.057710 kernel: acpiphp: Slot [7] registered Jan 15 01:14:33.057718 kernel: acpiphp: Slot [8] registered Jan 15 01:14:33.057727 kernel: acpiphp: Slot [9] registered Jan 15 01:14:33.057737 kernel: acpiphp: Slot [10] registered Jan 15 01:14:33.057745 kernel: acpiphp: Slot [11] registered Jan 15 01:14:33.057753 kernel: acpiphp: Slot [12] registered Jan 15 01:14:33.057762 kernel: acpiphp: Slot [13] registered Jan 15 01:14:33.057770 kernel: acpiphp: Slot [14] registered Jan 15 01:14:33.057778 kernel: acpiphp: Slot [15] registered Jan 15 01:14:33.057787 kernel: acpiphp: Slot [16] registered Jan 15 01:14:33.057797 kernel: acpiphp: Slot [17] registered Jan 15 01:14:33.057805 kernel: acpiphp: Slot [18] registered Jan 15 01:14:33.057814 kernel: acpiphp: Slot [19] registered Jan 15 01:14:33.057822 kernel: acpiphp: Slot [20] registered Jan 15 01:14:33.057830 kernel: acpiphp: Slot [21] registered Jan 15 01:14:33.057838 kernel: acpiphp: Slot [22] registered Jan 15 01:14:33.057847 kernel: acpiphp: Slot [23] registered Jan 15 01:14:33.057855 kernel: acpiphp: Slot [24] registered Jan 15 01:14:33.057865 kernel: acpiphp: Slot [25] registered Jan 15 01:14:33.057873 kernel: acpiphp: Slot [26] registered Jan 15 01:14:33.057881 kernel: acpiphp: Slot [27] registered Jan 15 01:14:33.057890 kernel: acpiphp: Slot [28] registered Jan 15 01:14:33.057898 kernel: acpiphp: Slot [29] registered Jan 15 01:14:33.057906 kernel: acpiphp: Slot [30] registered Jan 15 01:14:33.057914 kernel: acpiphp: Slot [31] registered Jan 15 01:14:33.058025 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 15 01:14:33.058129 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 15 01:14:33.058228 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:14:33.058238 kernel: acpiphp: Slot [0-2] registered Jan 15 01:14:33.058698 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 15 01:14:33.058807 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 15 01:14:33.058912 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 15 01:14:33.059013 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 15 01:14:33.059112 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:14:33.059123 kernel: acpiphp: Slot [0-3] registered Jan 15 01:14:33.059226 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 15 01:14:33.061368 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 15 01:14:33.061497 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 15 01:14:33.061602 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:14:33.061615 
kernel: acpiphp: Slot [0-4] registered Jan 15 01:14:33.061722 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 15 01:14:33.061825 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 15 01:14:33.061924 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:14:33.061938 kernel: acpiphp: Slot [0-5] registered Jan 15 01:14:33.062041 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 15 01:14:33.062140 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 15 01:14:33.062238 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 15 01:14:33.062350 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:14:33.062362 kernel: acpiphp: Slot [0-6] registered Jan 15 01:14:33.062461 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:14:33.062472 kernel: acpiphp: Slot [0-7] registered Jan 15 01:14:33.062567 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:14:33.062578 kernel: acpiphp: Slot [0-8] registered Jan 15 01:14:33.062674 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:14:33.062685 kernel: acpiphp: Slot [0-9] registered Jan 15 01:14:33.062783 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:14:33.062794 kernel: acpiphp: Slot [0-10] registered Jan 15 01:14:33.062888 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:14:33.062899 kernel: acpiphp: Slot [0-11] registered Jan 15 01:14:33.062997 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:14:33.063008 kernel: acpiphp: Slot [0-12] registered Jan 15 01:14:33.063103 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:14:33.063117 kernel: acpiphp: Slot [0-13] registered Jan 15 01:14:33.063212 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:14:33.063223 kernel: acpiphp: Slot [0-14] registered Jan 15 01:14:33.064538 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:14:33.064558 kernel: acpiphp: Slot [0-15] registered Jan 15 01:14:33.064668 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:14:33.064684 kernel: acpiphp: Slot [0-16] registered Jan 15 01:14:33.064785 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 01:14:33.064796 kernel: acpiphp: Slot [0-17] registered Jan 15 01:14:33.064906 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:14:33.064918 kernel: acpiphp: Slot [0-18] registered Jan 15 01:14:33.065016 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:14:33.065030 kernel: acpiphp: Slot [0-19] registered Jan 15 01:14:33.065126 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:14:33.065138 kernel: acpiphp: Slot [0-20] registered Jan 15 01:14:33.065234 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:14:33.065245 kernel: acpiphp: Slot [0-21] registered Jan 15 01:14:33.065354 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:14:33.065366 kernel: acpiphp: Slot [0-22] registered Jan 15 01:14:33.065463 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:14:33.065475 kernel: acpiphp: Slot [0-23] registered Jan 15 01:14:33.065571 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:14:33.065582 kernel: acpiphp: Slot [0-24] registered Jan 15 01:14:33.065678 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:14:33.065689 kernel: acpiphp: Slot [0-25] registered Jan 15 01:14:33.065787 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:14:33.065798 kernel: acpiphp: Slot [0-26] registered Jan 15 01:14:33.065893 kernel: pci 0000:00:05.1: PCI bridge to [bus 
1b] Jan 15 01:14:33.065905 kernel: acpiphp: Slot [0-27] registered Jan 15 01:14:33.065999 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:14:33.066011 kernel: acpiphp: Slot [0-28] registered Jan 15 01:14:33.066106 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:14:33.066120 kernel: acpiphp: Slot [0-29] registered Jan 15 01:14:33.066253 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:14:33.066265 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 15 01:14:33.066274 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 15 01:14:33.066282 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 15 01:14:33.066302 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 15 01:14:33.066313 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 15 01:14:33.066321 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 15 01:14:33.066330 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 15 01:14:33.066338 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 15 01:14:33.066347 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 15 01:14:33.066355 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 15 01:14:33.066364 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 15 01:14:33.066374 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 15 01:14:33.066383 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 15 01:14:33.066391 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 15 01:14:33.066400 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 15 01:14:33.066409 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 15 01:14:33.066418 kernel: iommu: Default domain type: Translated Jan 15 01:14:33.066426 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 15 01:14:33.066435 kernel: efivars: Registered efivars operations Jan 15 01:14:33.066445 kernel: PCI: Using ACPI for IRQ routing Jan 15 01:14:33.066454 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 15 01:14:33.066463 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 15 01:14:33.066471 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 15 01:14:33.066479 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 15 01:14:33.066487 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 15 01:14:33.066495 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 15 01:14:33.066506 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 15 01:14:33.066515 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 15 01:14:33.066523 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 15 01:14:33.066531 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 15 01:14:33.066631 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 15 01:14:33.066727 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 15 01:14:33.066826 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 15 01:14:33.066837 kernel: vgaarb: loaded Jan 15 01:14:33.066846 kernel: clocksource: Switched to clocksource kvm-clock Jan 15 01:14:33.066854 kernel: VFS: Disk quotas dquot_6.6.0 Jan 15 01:14:33.066863 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 15 01:14:33.066872 kernel: pnp: PnP ACPI init Jan 15 01:14:33.066987 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] 
has been reserved Jan 15 01:14:33.067002 kernel: pnp: PnP ACPI: found 5 devices Jan 15 01:14:33.067011 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 15 01:14:33.067020 kernel: NET: Registered PF_INET protocol family Jan 15 01:14:33.067028 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 15 01:14:33.067037 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 15 01:14:33.067045 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 15 01:14:33.067054 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 15 01:14:33.067065 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 15 01:14:33.067073 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 15 01:14:33.067081 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 15 01:14:33.067091 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 15 01:14:33.067099 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 15 01:14:33.067107 kernel: NET: Registered PF_XDP protocol family Jan 15 01:14:33.067211 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 15 01:14:33.067346 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 15 01:14:33.067448 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 15 01:14:33.067547 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 15 01:14:33.067645 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 15 01:14:33.067742 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 15 01:14:33.067841 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 15 01:14:33.067941 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 15 01:14:33.068038 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 15 01:14:33.068135 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 15 01:14:33.068248 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 15 01:14:33.068361 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 15 01:14:33.068460 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 15 01:14:33.068649 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 15 01:14:33.068751 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 15 01:14:33.068862 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 15 01:14:33.068962 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 15 01:14:33.070869 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 15 01:14:33.070998 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 15 01:14:33.071103 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 15 01:14:33.071207 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 15 01:14:33.071317 kernel: pci 
0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 15 01:14:33.071416 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 15 01:14:33.071515 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 15 01:14:33.071610 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 15 01:14:33.071707 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 15 01:14:33.071806 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 15 01:14:33.071906 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 15 01:14:33.072003 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 15 01:14:33.072100 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 15 01:14:33.072196 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 15 01:14:33.072365 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 15 01:14:33.072472 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 15 01:14:33.072575 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 15 01:14:33.072673 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 15 01:14:33.072769 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 15 01:14:33.072875 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 15 01:14:33.072974 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 15 01:14:33.073071 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 15 01:14:33.073813 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 15 01:14:33.073913 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 15 01:14:33.074010 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 15 01:14:33.074108 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.074205 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.074312 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.074411 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.074507 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.074603 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.074759 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.074855 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.074951 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.075046 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.075146 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.075241 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.077384 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.077494 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.077598 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 
01:14:33.077695 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.077795 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.077891 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.077986 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.078081 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.078178 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.079384 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.079502 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.079605 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.079703 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.079799 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.079896 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.080002 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.080102 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.080201 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.080346 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 15 01:14:33.080445 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 15 01:14:33.080541 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 15 01:14:33.080636 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 15 01:14:33.080732 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 15 01:14:33.080844 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 15 01:14:33.080943 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 15 01:14:33.081039 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 15 01:14:33.081135 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 15 01:14:33.081232 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 15 01:14:33.081348 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 15 01:14:33.081446 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 15 01:14:33.081546 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 15 01:14:33.081643 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.081738 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.081834 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.081929 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.082025 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.082121 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.082219 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.084360 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.084484 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.084585 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.084683 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.084779 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.084894 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.084990 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.085087 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.085185 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.086455 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.086563 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.086661 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.086762 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.086859 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.086954 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.087053 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.087148 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.087245 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.088092 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.088310 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.088412 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.088511 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:14:33.088608 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:14:33.088711 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:14:33.088814 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 15 01:14:33.088922 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 15 01:14:33.089020 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.089117 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:14:33.089213 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 15 01:14:33.089316 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 15 01:14:33.089412 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.089514 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Jan 15 01:14:33.089614 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:14:33.089708 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 15 01:14:33.089803 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:14:33.089898 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:14:33.089992 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 15 01:14:33.090088 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 
01:14:33.090183 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:14:33.090281 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 15 01:14:33.090997 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:14:33.091100 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:14:33.091196 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 15 01:14:33.091312 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:14:33.091412 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:14:33.091508 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 15 01:14:33.091603 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:14:33.091706 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:14:33.091813 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 15 01:14:33.091911 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:14:33.092006 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:14:33.092102 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 15 01:14:33.092199 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:14:33.092304 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:14:33.092400 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 15 01:14:33.092495 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:14:33.092591 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:14:33.092685 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 15 01:14:33.092779 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:14:33.092889 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:14:33.092984 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 15 01:14:33.093078 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:14:33.093174 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:14:33.093270 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 15 01:14:33.093990 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:14:33.094101 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:14:33.094230 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 15 01:14:33.095391 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:14:33.095503 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:14:33.095602 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 15 01:14:33.095699 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:14:33.095799 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:14:33.095894 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 15 01:14:33.095991 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:14:33.096092 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 01:14:33.096188 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 15 01:14:33.096282 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 
01:14:33.096390 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:14:33.096486 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 15 01:14:33.096582 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 15 01:14:33.096677 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:14:33.096780 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:14:33.096888 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 15 01:14:33.096984 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 15 01:14:33.097088 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:14:33.097187 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:14:33.097285 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 15 01:14:33.099396 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 15 01:14:33.099499 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:14:33.099600 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:14:33.099695 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 15 01:14:33.099791 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 15 01:14:33.099886 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:14:33.099988 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:14:33.100083 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 15 01:14:33.100179 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 15 01:14:33.100274 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:14:33.100983 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:14:33.101086 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 15 01:14:33.101186 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 15 01:14:33.101282 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:14:33.102499 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:14:33.102609 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 15 01:14:33.102708 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 15 01:14:33.102804 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:14:33.102908 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:14:33.103004 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 15 01:14:33.103099 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 15 01:14:33.103194 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:14:33.103300 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:14:33.103397 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 15 01:14:33.103495 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 15 01:14:33.103590 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:14:33.103688 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 15 01:14:33.103782 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 15 01:14:33.103877 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 15 01:14:33.103972 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 
01:14:33.104071 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:14:33.104166 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 15 01:14:33.104260 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 15 01:14:33.104362 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:14:33.104459 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:14:33.104555 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 15 01:14:33.104652 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 15 01:14:33.104748 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:14:33.104879 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:14:33.104974 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 15 01:14:33.105070 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 15 01:14:33.105166 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:14:33.105262 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 15 01:14:33.105362 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 15 01:14:33.105448 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 15 01:14:33.105534 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 15 01:14:33.105619 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 15 01:14:33.105705 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 15 01:14:33.105806 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 15 01:14:33.105897 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 15 01:14:33.105985 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.106084 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 15 01:14:33.106176 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 15 01:14:33.106269 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:14:33.106383 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 15 01:14:33.106473 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:14:33.106569 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 15 01:14:33.106658 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 01:14:33.106754 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 15 01:14:33.106847 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:14:33.106944 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 15 01:14:33.107034 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:14:33.107129 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 15 01:14:33.107219 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:14:33.107339 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 15 01:14:33.107431 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:14:33.107529 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 15 01:14:33.107618 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:14:33.107713 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Jan 15 01:14:33.107806 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:14:33.107902 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 15 01:14:33.107991 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:14:33.108086 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 15 01:14:33.108177 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:14:33.108275 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 15 01:14:33.108377 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:14:33.108472 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 15 01:14:33.108561 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:14:33.108658 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 15 01:14:33.108751 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:14:33.108854 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 15 01:14:33.108946 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:14:33.109041 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 15 01:14:33.109130 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 01:14:33.109229 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 15 01:14:33.109325 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 15 01:14:33.109413 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:14:33.109508 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 15 01:14:33.109597 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 15 01:14:33.109685 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:14:33.109781 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 15 01:14:33.109870 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 15 01:14:33.109958 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:14:33.110054 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 15 01:14:33.110144 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 15 01:14:33.110239 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:14:33.110347 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 15 01:14:33.110436 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Jan 15 01:14:33.110524 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:14:33.110619 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 15 01:14:33.110708 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 15 01:14:33.110798 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:14:33.110892 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 15 01:14:33.110980 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 15 01:14:33.111068 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:14:33.111161 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 15 01:14:33.111249 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Jan 15 01:14:33.111349 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:14:33.111443 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 15 01:14:33.111531 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 15 01:14:33.111620 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:14:33.111713 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 15 01:14:33.111811 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 15 01:14:33.111901 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 01:14:33.111995 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 15 01:14:33.112085 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 15 01:14:33.112174 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:14:33.112270 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 15 01:14:33.112374 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 15 01:14:33.112463 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:14:33.112557 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 15 01:14:33.112647 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 15 01:14:33.112735 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:14:33.112748 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 15 01:14:33.112759 kernel: PCI: CLS 0 bytes, default 64 Jan 15 01:14:33.112769 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 15 01:14:33.112778 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 15 01:14:33.112786 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 15 01:14:33.112795 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 15 01:14:33.112804 kernel: Initialise system trusted keyrings Jan 15 01:14:33.112813 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 15 01:14:33.112824 kernel: Key type asymmetric registered Jan 15 01:14:33.112840 kernel: Asymmetric key parser 'x509' registered Jan 15 01:14:33.112848 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 15 01:14:33.112857 kernel: io scheduler mq-deadline registered Jan 15 01:14:33.112866 kernel: io scheduler kyber registered Jan 15 01:14:33.112874 kernel: io scheduler bfq registered Jan 15 01:14:33.112977 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 15 01:14:33.113081 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 15 01:14:33.113180 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 15 01:14:33.113277 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 15 01:14:33.118669 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 15 01:14:33.118793 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 15 01:14:33.118904 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 15 01:14:33.119003 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 15 01:14:33.119101 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 15 01:14:33.119199 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 15 01:14:33.119330 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 15 01:14:33.119433 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Jan 15 01:14:33.119533 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 15 01:14:33.119631 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 15 01:14:33.119729 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 15 01:14:33.119829 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 15 01:14:33.119843 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 15 01:14:33.119942 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 15 01:14:33.120040 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 15 01:14:33.120139 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 15 01:14:33.120236 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 15 01:14:33.120361 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 15 01:14:33.120459 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 15 01:14:33.120558 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 15 01:14:33.120653 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 15 01:14:33.120750 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 15 01:14:33.120888 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 15 01:14:33.120988 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 15 01:14:33.121086 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 15 01:14:33.121185 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 15 01:14:33.121283 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 15 01:14:33.121398 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 15 01:14:33.121494 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 15 01:14:33.121506 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 15 01:14:33.121603 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 15 01:14:33.121700 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 15 01:14:33.121801 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 15 01:14:33.121899 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 15 01:14:33.121999 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 15 01:14:33.122097 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 15 01:14:33.122194 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 15 01:14:33.123149 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 15 01:14:33.123287 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 15 01:14:33.123400 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 15 01:14:33.123505 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 15 01:14:33.123604 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 15 01:14:33.123701 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 15 01:14:33.123804 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 15 01:14:33.123902 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 15 01:14:33.123996 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 15 01:14:33.124008 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 15 01:14:33.124106 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 15 01:14:33.124201 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 15 01:14:33.124306 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 15 01:14:33.124402 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 15 01:14:33.124499 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 15 01:14:33.124594 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 15 01:14:33.124690 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Jan 15 01:14:33.124789 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 15 01:14:33.124900 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 15 01:14:33.124995 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 15 01:14:33.125006 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 15 01:14:33.125015 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 15 01:14:33.125024 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 15 01:14:33.125036 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 15 01:14:33.125045 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 15 01:14:33.125054 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 15 01:14:33.125159 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 15 01:14:33.125172 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 15 01:14:33.125262 kernel: rtc_cmos 00:03: registered as rtc0 Jan 15 01:14:33.125363 kernel: rtc_cmos 00:03: setting system clock to 2026-01-15T01:14:31 UTC (1768439671) Jan 15 01:14:33.125459 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 15 01:14:33.125470 kernel: intel_pstate: CPU model not supported Jan 15 01:14:33.125479 kernel: efifb: probing for efifb Jan 15 01:14:33.125487 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 15 01:14:33.125496 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 15 01:14:33.125505 kernel: efifb: scrolling: redraw Jan 15 01:14:33.125513 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 15 01:14:33.125524 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 01:14:33.125533 kernel: fb0: EFI VGA frame buffer device Jan 15 01:14:33.125542 kernel: pstore: Using crash dump compression: deflate Jan 15 01:14:33.125550 kernel: pstore: Registered efi_pstore as persistent store backend Jan 15 01:14:33.125559 kernel: NET: Registered PF_INET6 protocol family Jan 15 01:14:33.125568 kernel: Segment Routing with IPv6 Jan 15 01:14:33.125576 kernel: In-situ OAM (IOAM) with IPv6 Jan 15 01:14:33.125587 kernel: NET: Registered PF_PACKET protocol family Jan 15 01:14:33.125595 kernel: Key type dns_resolver registered Jan 15 01:14:33.125604 kernel: IPI shorthand broadcast: enabled Jan 15 01:14:33.125613 kernel: sched_clock: Marking stable (2595003435, 156431077)->(2857148320, -105713808) Jan 15 01:14:33.125621 kernel: registered taskstats version 1 Jan 15 01:14:33.125629 kernel: Loading compiled-in X.509 certificates Jan 15 01:14:33.125638 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: e8b6753a1cbf8103f5806ce5d59781743c62fae9' Jan 15 01:14:33.125648 kernel: Demotion targets for Node 0: null Jan 15 01:14:33.125657 kernel: Key type .fscrypt registered Jan 15 01:14:33.125665 kernel: Key type fscrypt-provisioning registered Jan 15 01:14:33.125673 kernel: ima: No TPM chip found, activating TPM-bypass! 
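The rtc_cmos entry above reports the same instant twice, as a wall-clock timestamp and as the raw Unix epoch value 1768439671. A quick check (an annotation, not part of the boot log) that the two agree:

    # Converts the epoch value from the rtc_cmos message to UTC.
    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1768439671, tz=timezone.utc).isoformat())
    # -> 2026-01-15T01:14:31+00:00, matching "setting system clock to ..." above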
Jan 15 01:14:33.125682 kernel: ima: Allocated hash algorithm: sha1 Jan 15 01:14:33.125691 kernel: ima: No architecture policies found Jan 15 01:14:33.125699 kernel: clk: Disabling unused clocks Jan 15 01:14:33.125710 kernel: Freeing unused kernel image (initmem) memory: 15432K Jan 15 01:14:33.125718 kernel: Write protecting the kernel read-only data: 45056k Jan 15 01:14:33.125727 kernel: Freeing unused kernel image (rodata/data gap) memory: 824K Jan 15 01:14:33.125735 kernel: Run /init as init process Jan 15 01:14:33.125743 kernel: with arguments: Jan 15 01:14:33.125752 kernel: /init Jan 15 01:14:33.125761 kernel: with environment: Jan 15 01:14:33.125769 kernel: HOME=/ Jan 15 01:14:33.125780 kernel: TERM=linux Jan 15 01:14:33.125788 kernel: SCSI subsystem initialized Jan 15 01:14:33.125797 kernel: libata version 3.00 loaded. Jan 15 01:14:33.125898 kernel: ahci 0000:00:1f.2: version 3.0 Jan 15 01:14:33.125910 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 15 01:14:33.126006 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 15 01:14:33.126105 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 15 01:14:33.126203 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 15 01:14:33.126339 kernel: scsi host0: ahci Jan 15 01:14:33.126446 kernel: scsi host1: ahci Jan 15 01:14:33.126567 kernel: scsi host2: ahci Jan 15 01:14:33.126675 kernel: scsi host3: ahci Jan 15 01:14:33.126784 kernel: scsi host4: ahci Jan 15 01:14:33.126890 kernel: scsi host5: ahci Jan 15 01:14:33.126903 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 15 01:14:33.126912 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 15 01:14:33.126920 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 15 01:14:33.126929 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 15 01:14:33.126940 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 15 01:14:33.126949 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 15 01:14:33.126958 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.126967 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.126975 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.126984 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.126993 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.127004 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 15 01:14:33.127012 kernel: ACPI: bus type USB registered Jan 15 01:14:33.127021 kernel: usbcore: registered new interface driver usbfs Jan 15 01:14:33.127030 kernel: usbcore: registered new interface driver hub Jan 15 01:14:33.127039 kernel: usbcore: registered new device driver usb Jan 15 01:14:33.127148 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 15 01:14:33.127253 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 15 01:14:33.127369 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 15 01:14:33.127471 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 15 01:14:33.127600 kernel: hub 1-0:1.0: USB hub found Jan 15 01:14:33.127714 kernel: hub 1-0:1.0: 2 ports detected Jan 15 01:14:33.127826 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 15 01:14:33.127929 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks 
(53.7 GB/50.0 GiB) Jan 15 01:14:33.127940 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 15 01:14:33.127950 kernel: GPT:25804799 != 104857599 Jan 15 01:14:33.127959 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 15 01:14:33.127967 kernel: GPT:25804799 != 104857599 Jan 15 01:14:33.127976 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 15 01:14:33.127985 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 01:14:33.127996 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 15 01:14:33.128005 kernel: device-mapper: uevent: version 1.0.3 Jan 15 01:14:33.128014 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 15 01:14:33.128023 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 15 01:14:33.128032 kernel: raid6: avx512x4 gen() 42194 MB/s Jan 15 01:14:33.128040 kernel: raid6: avx512x2 gen() 44033 MB/s Jan 15 01:14:33.128049 kernel: raid6: avx512x1 gen() 44261 MB/s Jan 15 01:14:33.128060 kernel: raid6: avx2x4 gen() 33808 MB/s Jan 15 01:14:33.128069 kernel: raid6: avx2x2 gen() 32819 MB/s Jan 15 01:14:33.128077 kernel: raid6: avx2x1 gen() 30124 MB/s Jan 15 01:14:33.128086 kernel: raid6: using algorithm avx512x1 gen() 44261 MB/s Jan 15 01:14:33.128096 kernel: raid6: .... xor() 24684 MB/s, rmw enabled Jan 15 01:14:33.128106 kernel: raid6: using avx512x2 recovery algorithm Jan 15 01:14:33.128117 kernel: xor: automatically using best checksumming function avx Jan 15 01:14:33.128126 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 15 01:14:33.128135 kernel: BTRFS: device fsid 1fc5e5ba-2a81-4f9e-b722-a47a3e33c106 devid 1 transid 34 /dev/mapper/usr (253:0) scanned by mount (203) Jan 15 01:14:33.128144 kernel: BTRFS info (device dm-0): first mount of filesystem 1fc5e5ba-2a81-4f9e-b722-a47a3e33c106 Jan 15 01:14:33.128153 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:14:33.128280 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 15 01:14:33.128302 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 15 01:14:33.128314 kernel: BTRFS info (device dm-0): enabling free space tree Jan 15 01:14:33.128323 kernel: loop: module loaded Jan 15 01:14:33.128332 kernel: loop0: detected capacity change from 0 to 100160 Jan 15 01:14:33.128341 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 15 01:14:33.128352 systemd[1]: Successfully made /usr/ read-only. Jan 15 01:14:33.128366 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 01:14:33.128378 systemd[1]: Detected virtualization kvm. Jan 15 01:14:33.128387 systemd[1]: Detected architecture x86-64. Jan 15 01:14:33.128396 systemd[1]: Running in initrd. Jan 15 01:14:33.128405 systemd[1]: No hostname configured, using default hostname. Jan 15 01:14:33.128416 systemd[1]: Hostname set to . Jan 15 01:14:33.128424 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 01:14:33.128435 systemd[1]: Queued start job for default target initrd.target. 
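The GPT complaints above ("25804799 != 104857599") mean the backup GPT header still sits where a smaller source image left it rather than at the last LBA of the 104857600-sector virtual disk, which is expected right after an image is written onto a larger volume. A minimal sketch of that check, assuming 512-byte sectors and a readable device node such as /dev/vda:

    # Compares the backup-header location recorded in the primary GPT header
    # (the AlternateLBA field at byte offset 32) with the device's last LBA.
    import os, struct

    def gpt_backup_location(dev="/dev/vda", sector=512):
        with open(dev, "rb") as f:
            f.seek(sector)                    # primary GPT header lives at LBA 1
            hdr = f.read(92)
            if hdr[:8] != b"EFI PART":
                raise ValueError("no GPT signature on %s" % dev)
            backup_lba = struct.unpack_from("<Q", hdr, 32)[0]
            last_lba = os.lseek(f.fileno(), 0, os.SEEK_END) // sector - 1
        return backup_lba, last_lba

    # Here this would report (25804799, 104857599); resizing the GPT with
    # parted, as the kernel suggests, or with `sgdisk -e` relocates the backup.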
Jan 15 01:14:33.128444 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 01:14:33.128454 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 01:14:33.128463 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:14:33.128474 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 15 01:14:33.128483 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 01:14:33.128495 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 15 01:14:33.128504 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 15 01:14:33.128513 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:14:33.128522 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:14:33.128531 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 15 01:14:33.128540 systemd[1]: Reached target paths.target - Path Units. Jan 15 01:14:33.128551 systemd[1]: Reached target slices.target - Slice Units. Jan 15 01:14:33.128561 systemd[1]: Reached target swap.target - Swaps. Jan 15 01:14:33.128570 systemd[1]: Reached target timers.target - Timer Units. Jan 15 01:14:33.128579 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 01:14:33.128588 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 01:14:33.128597 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:14:33.128606 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 15 01:14:33.128617 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 15 01:14:33.128626 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:14:33.128635 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 01:14:33.128645 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:14:33.128654 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 01:14:33.128664 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 15 01:14:33.128673 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 15 01:14:33.128684 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 01:14:33.128694 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 15 01:14:33.128703 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 15 01:14:33.128713 systemd[1]: Starting systemd-fsck-usr.service... Jan 15 01:14:33.128722 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 01:14:33.128731 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 01:14:33.128743 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:33.128752 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
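The \x2d sequences in the device unit names above come from systemd's path escaping, which maps a path such as /dev/disk/by-label/EFI-SYSTEM onto the unit name dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. A simplified sketch of that rule (the real logic is systemd's unit_name_from_path(); this version skips a few corner cases):

    # Strip slashes at the ends, turn inner '/' into '-', and hex-escape
    # anything that is not alphanumeric, '_' or a non-leading '.'.
    def systemd_escape_path(path: str) -> str:
        out = []
        for i, ch in enumerate(path.strip("/")):
            if ch == "/":
                out.append("-")
            elif ch.isalnum() or ch == "_" or (ch == "." and i > 0):
                out.append(ch)
            else:
                out.append("\\x%02x" % ord(ch))
        return "".join(out)

    print(systemd_escape_path("/dev/disk/by-label/EFI-SYSTEM") + ".device")
    # -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device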
Jan 15 01:14:33.128761 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:14:33.128770 systemd[1]: Finished systemd-fsck-usr.service. Jan 15 01:14:33.128810 systemd-journald[341]: Collecting audit messages is enabled. Jan 15 01:14:33.128844 kernel: audit: type=1130 audit(1768439673.026:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.128855 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 01:14:33.128866 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 01:14:33.128876 kernel: audit: type=1130 audit(1768439673.068:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.128886 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 01:14:33.128895 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 15 01:14:33.128904 kernel: Bridge firewalling registered Jan 15 01:14:33.128913 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 01:14:33.128924 kernel: audit: type=1130 audit(1768439673.101:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.128933 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:33.128942 kernel: audit: type=1130 audit(1768439673.106:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.128952 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:14:33.128962 kernel: audit: type=1130 audit(1768439673.112:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.128971 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 01:14:33.128982 systemd-journald[341]: Journal started Jan 15 01:14:33.129004 systemd-journald[341]: Runtime Journal (/run/log/journal/0698f5eadb4d4a129fa6780908e52a90) is 8M, max 77.9M, 69.9M free. Jan 15 01:14:33.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.068000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.101000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:33.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.097081 systemd-modules-load[343]: Inserted module 'br_netfilter' Jan 15 01:14:33.135345 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 01:14:33.137379 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 01:14:33.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.142325 kernel: audit: type=1130 audit(1768439673.137:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.146065 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 01:14:33.153995 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 01:14:33.159367 kernel: audit: type=1130 audit(1768439673.155:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.158283 systemd-tmpfiles[368]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 15 01:14:33.161732 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 15 01:14:33.163628 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:14:33.168553 kernel: audit: type=1130 audit(1768439673.163:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.164406 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 01:14:33.169000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.173503 kernel: audit: type=1130 audit(1768439673.169:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.178000 audit: BPF prog-id=6 op=LOAD Jan 15 01:14:33.179604 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
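journald notes above that audit collection is enabled, and each SERVICE_START shows up both as an audit[1] record and as a mirrored kernel audit: type=1130 line. A small sketch for pulling the unit name and result out of records shaped like these:

    # Extracts unit= and res= from an audit SERVICE_START record; the sample
    # string is modelled on the records in this log.
    import re

    record = ("audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 "
              "subj=kernel msg='unit=systemd-modules-load comm=\"systemd\" "
              "exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? terminal=? "
              "res=success'")

    m = re.search(r"unit=(?P<unit>[\w.@-]+).*?res=(?P<res>\w+)", record)
    if m:
        print(m.group("unit"), m.group("res"))   # -> systemd-modules-load success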
Jan 15 01:14:33.193171 dracut-cmdline[378]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:14:33.232184 systemd-resolved[381]: Positive Trust Anchors: Jan 15 01:14:33.232198 systemd-resolved[381]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 01:14:33.232201 systemd-resolved[381]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 01:14:33.232233 systemd-resolved[381]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 01:14:33.256020 systemd-resolved[381]: Defaulting to hostname 'linux'. Jan 15 01:14:33.258013 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 01:14:33.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.258746 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:14:33.314324 kernel: Loading iSCSI transport class v2.0-870. Jan 15 01:14:33.334314 kernel: iscsi: registered transport (tcp) Jan 15 01:14:33.357691 kernel: iscsi: registered transport (qla4xxx) Jan 15 01:14:33.357758 kernel: QLogic iSCSI HBA Driver Jan 15 01:14:33.387904 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 01:14:33.404141 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:14:33.405000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.407335 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 01:14:33.447320 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 15 01:14:33.447000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.449800 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 15 01:14:33.452516 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 15 01:14:33.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:33.485134 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 15 01:14:33.487000 audit: BPF prog-id=7 op=LOAD Jan 15 01:14:33.487000 audit: BPF prog-id=8 op=LOAD Jan 15 01:14:33.488425 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:14:33.516253 systemd-udevd[616]: Using default interface naming scheme 'v257'. Jan 15 01:14:33.526195 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 01:14:33.528000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.529901 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 15 01:14:33.554230 dracut-pre-trigger[692]: rd.md=0: removing MD RAID activation Jan 15 01:14:33.558730 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 01:14:33.559000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.560000 audit: BPF prog-id=9 op=LOAD Jan 15 01:14:33.561662 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 01:14:33.584560 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 01:14:33.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.587416 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 01:14:33.609564 systemd-networkd[735]: lo: Link UP Jan 15 01:14:33.610341 systemd-networkd[735]: lo: Gained carrier Jan 15 01:14:33.611314 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 01:14:33.611000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.611879 systemd[1]: Reached target network.target - Network. Jan 15 01:14:33.677569 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:14:33.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.680856 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 15 01:14:33.770012 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 15 01:14:33.800038 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 15 01:14:33.810689 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 15 01:14:33.821724 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 01:14:33.827394 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 15 01:14:33.828043 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
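The "Found device ..." entries above resolve via the udev-maintained symlinks under /dev/disk/by-label. Once a shell is available, the same mapping can be listed directly (a sketch; the set of labels and their targets differ per machine):

    # Prints the label -> partition mapping behind the by-label device units.
    from pathlib import Path

    for link in sorted(Path("/dev/disk/by-label").iterdir()):
        print(f"{link.name:12s} -> {link.resolve()}")
    # On this image OEM resolves to /dev/vda6 (cf. the BTRFS "device label OEM"
    # line later in the log); the other labels map to their own vda partitions.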
Jan 15 01:14:33.839578 kernel: usbcore: registered new interface driver usbhid Jan 15 01:14:33.839640 kernel: usbhid: USB HID core driver Jan 15 01:14:33.851537 disk-uuid[795]: Primary Header is updated. Jan 15 01:14:33.851537 disk-uuid[795]: Secondary Entries is updated. Jan 15 01:14:33.851537 disk-uuid[795]: Secondary Header is updated. Jan 15 01:14:33.858725 kernel: cryptd: max_cpu_qlen set to 1000 Jan 15 01:14:33.882607 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 15 01:14:33.899593 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:14:33.899740 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:33.901000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.901835 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:33.904011 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:33.910316 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 15 01:14:33.918318 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jan 15 01:14:33.920033 systemd-networkd[735]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:14:33.920041 systemd-networkd[735]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 01:14:33.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.922000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:33.921133 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:14:33.921236 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:33.921794 systemd-networkd[735]: eth0: Link UP Jan 15 01:14:33.921951 systemd-networkd[735]: eth0: Gained carrier Jan 15 01:14:33.921964 systemd-networkd[735]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:14:33.929371 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:33.933391 systemd-networkd[735]: eth0: DHCPv4 address 10.0.7.78/25, gateway 10.0.7.1 acquired from 10.0.7.1 Jan 15 01:14:33.938313 kernel: AES CTR mode by8 optimization enabled Jan 15 01:14:33.997496 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:33.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.034123 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
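The DHCP lease above is 10.0.7.78/25 with gateway 10.0.7.1, i.e. a 128-address subnet that contains the gateway on-link. The standard-library ipaddress module makes the arithmetic explicit:

    # Worked example for the /25 lease reported by systemd-networkd above.
    import ipaddress

    iface = ipaddress.ip_interface("10.0.7.78/25")
    print(iface.network)                                      # 10.0.7.0/25
    print(iface.network.num_addresses)                        # 128 addresses
    print(ipaddress.ip_address("10.0.7.1") in iface.network)  # True: gateway is on-link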
Jan 15 01:14:34.040544 kernel: kauditd_printk_skb: 17 callbacks suppressed Jan 15 01:14:34.040574 kernel: audit: type=1130 audit(1768439674.034:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.035353 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 01:14:34.041396 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:14:34.042418 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 01:14:34.044524 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 15 01:14:34.075649 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 15 01:14:34.080248 kernel: audit: type=1130 audit(1768439674.075:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.075000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.913019 disk-uuid[797]: Warning: The kernel is still using the old partition table. Jan 15 01:14:34.913019 disk-uuid[797]: The new table will be used at the next reboot or after you Jan 15 01:14:34.913019 disk-uuid[797]: run partprobe(8) or kpartx(8) Jan 15 01:14:34.913019 disk-uuid[797]: The operation has completed successfully. Jan 15 01:14:34.918264 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 15 01:14:34.918564 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 15 01:14:34.927711 kernel: audit: type=1130 audit(1768439674.919:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.927747 kernel: audit: type=1131 audit(1768439674.919:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:34.922426 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 15 01:14:34.988327 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (920) Jan 15 01:14:34.991390 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:14:34.991470 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:14:34.998667 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:14:34.998751 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:14:35.006319 kernel: BTRFS info (device vda6): last unmount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:14:35.006916 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 15 01:14:35.012453 kernel: audit: type=1130 audit(1768439675.007:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:35.007000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:35.008757 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 15 01:14:35.224596 ignition[939]: Ignition 2.22.0 Jan 15 01:14:35.224609 ignition[939]: Stage: fetch-offline Jan 15 01:14:35.224654 ignition[939]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:35.224664 ignition[939]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:35.228000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:35.227637 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 01:14:35.232438 kernel: audit: type=1130 audit(1768439675.228:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:35.224757 ignition[939]: parsed url from cmdline: "" Jan 15 01:14:35.231454 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 15 01:14:35.224760 ignition[939]: no config URL provided Jan 15 01:14:35.224765 ignition[939]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 01:14:35.224773 ignition[939]: no config at "/usr/lib/ignition/user.ign" Jan 15 01:14:35.224778 ignition[939]: failed to fetch config: resource requires networking Jan 15 01:14:35.225248 ignition[939]: Ignition finished successfully Jan 15 01:14:35.260742 ignition[946]: Ignition 2.22.0 Jan 15 01:14:35.260754 ignition[946]: Stage: fetch Jan 15 01:14:35.260950 ignition[946]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:35.260959 ignition[946]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:35.261047 ignition[946]: parsed url from cmdline: "" Jan 15 01:14:35.261289 ignition[946]: no config URL provided Jan 15 01:14:35.261318 ignition[946]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 01:14:35.261328 ignition[946]: no config at "/usr/lib/ignition/user.ign" Jan 15 01:14:35.261420 ignition[946]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 15 01:14:35.261513 ignition[946]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Jan 15 01:14:35.261535 ignition[946]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 15 01:14:35.931587 systemd-networkd[735]: eth0: Gained IPv6LL Jan 15 01:14:35.972977 ignition[946]: GET result: OK Jan 15 01:14:35.973093 ignition[946]: parsing config with SHA512: eff24e5391d3478ee3b56d4d6809d7888fb455114caf0f9e67f186ae438f1b9b3276eeda41f9cef98b93c2b54032975ebe5d81716ee1cb7e685cbc12907f0705 Jan 15 01:14:35.978399 unknown[946]: fetched base config from "system" Jan 15 01:14:35.978411 unknown[946]: fetched base config from "system" Jan 15 01:14:35.978416 unknown[946]: fetched user config from "openstack" Jan 15 01:14:35.979674 ignition[946]: fetch: fetch complete Jan 15 01:14:35.979679 ignition[946]: fetch: fetch passed Jan 15 01:14:35.979735 ignition[946]: Ignition finished successfully Jan 15 01:14:35.982084 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 15 01:14:35.982000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:35.983967 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 15 01:14:35.986378 kernel: audit: type=1130 audit(1768439675.982:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.023048 ignition[952]: Ignition 2.22.0 Jan 15 01:14:36.023060 ignition[952]: Stage: kargs Jan 15 01:14:36.023216 ignition[952]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:36.023224 ignition[952]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:36.024026 ignition[952]: kargs: kargs passed Jan 15 01:14:36.024072 ignition[952]: Ignition finished successfully Jan 15 01:14:36.027917 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 15 01:14:36.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.031445 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 15 01:14:36.033385 kernel: audit: type=1130 audit(1768439676.028:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.071466 ignition[958]: Ignition 2.22.0 Jan 15 01:14:36.072307 ignition[958]: Stage: disks Jan 15 01:14:36.072462 ignition[958]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:36.072471 ignition[958]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:36.073107 ignition[958]: disks: disks passed Jan 15 01:14:36.073148 ignition[958]: Ignition finished successfully Jan 15 01:14:36.075338 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 15 01:14:36.075000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.077489 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 15 01:14:36.080336 kernel: audit: type=1130 audit(1768439676.075:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:36.080448 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 15 01:14:36.081181 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 01:14:36.081574 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 01:14:36.081899 systemd[1]: Reached target basic.target - Basic System. Jan 15 01:14:36.085245 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 15 01:14:36.142313 systemd-fsck[967]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 15 01:14:36.145597 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 15 01:14:36.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.150375 kernel: audit: type=1130 audit(1768439676.145:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.150387 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 15 01:14:36.296317 kernel: EXT4-fs (vda9): mounted filesystem 6f459a58-5046-4124-bfbc-09321f1e67d8 r/w with ordered data mode. Quota mode: none. Jan 15 01:14:36.297381 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 15 01:14:36.298581 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 15 01:14:36.301901 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 01:14:36.305939 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 15 01:14:36.306800 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 15 01:14:36.311456 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 15 01:14:36.313613 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 15 01:14:36.313658 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 01:14:36.320967 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 15 01:14:36.323692 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 15 01:14:36.333332 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (975) Jan 15 01:14:36.343576 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:14:36.346307 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:14:36.350753 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:14:36.350793 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:14:36.353032 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 15 01:14:36.423342 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:36.452748 initrd-setup-root[1003]: cut: /sysroot/etc/passwd: No such file or directory Jan 15 01:14:36.460371 initrd-setup-root[1010]: cut: /sysroot/etc/group: No such file or directory Jan 15 01:14:36.465468 initrd-setup-root[1017]: cut: /sysroot/etc/shadow: No such file or directory Jan 15 01:14:36.470423 initrd-setup-root[1024]: cut: /sysroot/etc/gshadow: No such file or directory Jan 15 01:14:36.573049 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 15 01:14:36.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.575858 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 15 01:14:36.577708 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 15 01:14:36.597891 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 15 01:14:36.600425 kernel: BTRFS info (device vda6): last unmount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:14:36.621455 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 15 01:14:36.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:36.643266 ignition[1091]: INFO : Ignition 2.22.0 Jan 15 01:14:36.643266 ignition[1091]: INFO : Stage: mount Jan 15 01:14:36.644586 ignition[1091]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:36.644586 ignition[1091]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:36.644586 ignition[1091]: INFO : mount: mount passed Jan 15 01:14:36.644586 ignition[1091]: INFO : Ignition finished successfully Jan 15 01:14:36.649910 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 15 01:14:36.650000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:37.470313 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:39.474338 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:43.479332 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:43.483910 coreos-metadata[977]: Jan 15 01:14:43.483 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:14:43.498813 coreos-metadata[977]: Jan 15 01:14:43.498 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 01:14:45.721622 coreos-metadata[977]: Jan 15 01:14:45.721 INFO Fetch successful Jan 15 01:14:45.723388 coreos-metadata[977]: Jan 15 01:14:45.723 INFO wrote hostname ci-4515-1-0-n-d76f075714 to /sysroot/etc/hostname Jan 15 01:14:45.724003 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 15 01:14:45.732106 kernel: kauditd_printk_skb: 3 callbacks suppressed Jan 15 01:14:45.732134 kernel: audit: type=1130 audit(1768439685.723:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:45.732147 kernel: audit: type=1131 audit(1768439685.723:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:45.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:45.723000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:45.724105 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 15 01:14:45.725763 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 15 01:14:45.752303 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 01:14:45.780330 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1109) Jan 15 01:14:45.783329 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:14:45.783391 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:14:45.789643 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:14:45.789715 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:14:45.791389 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 01:14:45.825551 ignition[1127]: INFO : Ignition 2.22.0 Jan 15 01:14:45.825551 ignition[1127]: INFO : Stage: files Jan 15 01:14:45.826965 ignition[1127]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:45.826965 ignition[1127]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:45.826965 ignition[1127]: DEBUG : files: compiled without relabeling support, skipping Jan 15 01:14:45.828479 ignition[1127]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 15 01:14:45.828479 ignition[1127]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 15 01:14:45.837017 ignition[1127]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 15 01:14:45.837719 ignition[1127]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 15 01:14:45.838166 ignition[1127]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 15 01:14:45.837744 unknown[1127]: wrote ssh authorized keys file for user: core Jan 15 01:14:45.840571 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 15 01:14:45.841477 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 15 01:14:45.903564 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] 
writing file "/sysroot/home/core/install.sh" Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 15 01:14:46.032374 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:14:46.036581 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 15 01:14:46.318863 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 15 01:14:46.911942 ignition[1127]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:14:46.911942 ignition[1127]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 15 01:14:46.924477 kernel: audit: type=1130 audit(1768439686.919:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:46.924569 ignition[1127]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 15 01:14:46.924569 ignition[1127]: INFO : files: files passed Jan 15 01:14:46.924569 ignition[1127]: INFO : Ignition finished successfully Jan 15 01:14:46.918598 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 15 01:14:46.923453 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 15 01:14:46.928349 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 15 01:14:46.936226 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 15 01:14:46.936444 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 15 01:14:46.937000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.942359 kernel: audit: type=1130 audit(1768439686.937:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.947323 kernel: audit: type=1131 audit(1768439686.941:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.952129 initrd-setup-root-after-ignition[1159]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:14:46.952129 initrd-setup-root-after-ignition[1159]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:14:46.954578 initrd-setup-root-after-ignition[1163]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:14:46.955741 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 01:14:46.957388 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 15 01:14:46.956000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:46.961685 kernel: audit: type=1130 audit(1768439686.956:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:46.962888 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 15 01:14:47.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.009057 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 15 01:14:47.017597 kernel: audit: type=1130 audit(1768439687.009:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.017626 kernel: audit: type=1131 audit(1768439687.009:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.009000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.009151 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 15 01:14:47.010234 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 15 01:14:47.019179 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 15 01:14:47.019923 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 15 01:14:47.020900 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 15 01:14:47.040816 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 01:14:47.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.043161 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 15 01:14:47.046948 kernel: audit: type=1130 audit(1768439687.041:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.063125 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 01:14:47.064054 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:14:47.064752 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:14:47.066322 systemd[1]: Stopped target timers.target - Timer Units. Jan 15 01:14:47.067368 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 15 01:14:47.067487 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 01:14:47.068000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.069121 systemd[1]: Stopped target initrd.target - Initrd Default Target. 
Jan 15 01:14:47.073949 kernel: audit: type=1131 audit(1768439687.068:50): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.072991 systemd[1]: Stopped target basic.target - Basic System. Jan 15 01:14:47.074540 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 15 01:14:47.075515 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 01:14:47.076401 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 15 01:14:47.076953 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 15 01:14:47.077450 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 15 01:14:47.078384 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 01:14:47.079318 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 15 01:14:47.080245 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 15 01:14:47.081105 systemd[1]: Stopped target swap.target - Swaps. Jan 15 01:14:47.082000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.081949 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 15 01:14:47.082072 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 15 01:14:47.083273 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:14:47.084944 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:14:47.085414 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 15 01:14:47.085496 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 01:14:47.086000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.086238 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 15 01:14:47.086372 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 15 01:14:47.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.087569 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 15 01:14:47.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.087669 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 01:14:47.088442 systemd[1]: ignition-files.service: Deactivated successfully. Jan 15 01:14:47.088528 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 15 01:14:47.091452 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 15 01:14:47.091858 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Jan 15 01:14:47.092000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.091965 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:14:47.094441 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 15 01:14:47.096000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.094868 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 15 01:14:47.096000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.094976 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 01:14:47.099000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.096498 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 15 01:14:47.096586 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:14:47.097243 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 15 01:14:47.097347 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 01:14:47.105594 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 15 01:14:47.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.106367 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 15 01:14:47.121255 ignition[1183]: INFO : Ignition 2.22.0 Jan 15 01:14:47.123022 ignition[1183]: INFO : Stage: umount Jan 15 01:14:47.123022 ignition[1183]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:14:47.123022 ignition[1183]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:14:47.123022 ignition[1183]: INFO : umount: umount passed Jan 15 01:14:47.123022 ignition[1183]: INFO : Ignition finished successfully Jan 15 01:14:47.122052 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 15 01:14:47.125567 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 15 01:14:47.127000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.125736 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 15 01:14:47.128066 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 15 01:14:47.128165 systemd[1]: Stopped ignition-disks.service - Ignition (disks). 
Jan 15 01:14:47.129000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.130088 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 15 01:14:47.130138 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 15 01:14:47.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.130545 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 15 01:14:47.132000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.130586 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 15 01:14:47.131764 systemd[1]: Stopped target network.target - Network. Jan 15 01:14:47.132377 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 15 01:14:47.132419 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 01:14:47.132962 systemd[1]: Stopped target paths.target - Path Units. Jan 15 01:14:47.133285 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 15 01:14:47.134347 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:14:47.137000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.134754 systemd[1]: Stopped target slices.target - Slice Units. Jan 15 01:14:47.138000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.135070 systemd[1]: Stopped target sockets.target - Socket Units. Jan 15 01:14:47.135426 systemd[1]: iscsid.socket: Deactivated successfully. Jan 15 01:14:47.135463 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 01:14:47.135803 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 15 01:14:47.135837 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 01:14:47.143000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.136727 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 15 01:14:47.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.136753 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:14:47.137467 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 15 01:14:47.137511 systemd[1]: Stopped ignition-setup.service - Ignition (setup). 
Jan 15 01:14:47.138145 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 15 01:14:47.147000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.138179 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 15 01:14:47.138901 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 15 01:14:47.139506 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 15 01:14:47.141621 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 15 01:14:47.148000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.141701 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 15 01:14:47.143963 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 15 01:14:47.144046 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 15 01:14:47.146752 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 15 01:14:47.147050 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 15 01:14:47.150000 audit: BPF prog-id=6 op=UNLOAD Jan 15 01:14:47.151000 audit: BPF prog-id=9 op=UNLOAD Jan 15 01:14:47.148606 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 15 01:14:47.148739 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 15 01:14:47.151518 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 15 01:14:47.152282 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 15 01:14:47.152358 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:14:47.154408 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 15 01:14:47.155106 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 15 01:14:47.155525 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 01:14:47.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.156347 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 15 01:14:47.156786 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:14:47.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.157575 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 15 01:14:47.157978 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 15 01:14:47.158000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.158762 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:14:47.172110 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 15 01:14:47.172793 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 15 01:14:47.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.174000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.173477 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 15 01:14:47.173513 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 15 01:14:47.176000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.173927 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 15 01:14:47.173951 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:14:47.174463 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 15 01:14:47.177000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.174502 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 15 01:14:47.175739 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 15 01:14:47.175779 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 15 01:14:47.176587 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 15 01:14:47.176623 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 01:14:47.180196 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 15 01:14:47.180986 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 15 01:14:47.181382 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:14:47.181000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.182257 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 15 01:14:47.182696 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:14:47.183000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.183535 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:14:47.183956 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:47.184000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.196317 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 15 01:14:47.194000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:47.197031 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 15 01:14:47.194000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.201000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:47.201671 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 15 01:14:47.201769 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 15 01:14:47.203874 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 15 01:14:47.205582 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 15 01:14:47.227148 systemd[1]: Switching root. Jan 15 01:14:47.270898 systemd-journald[341]: Journal stopped Jan 15 01:14:48.410208 systemd-journald[341]: Received SIGTERM from PID 1 (systemd). Jan 15 01:14:48.410305 kernel: SELinux: policy capability network_peer_controls=1 Jan 15 01:14:48.410329 kernel: SELinux: policy capability open_perms=1 Jan 15 01:14:48.410349 kernel: SELinux: policy capability extended_socket_class=1 Jan 15 01:14:48.410361 kernel: SELinux: policy capability always_check_network=0 Jan 15 01:14:48.410372 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 15 01:14:48.410384 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 15 01:14:48.410403 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 15 01:14:48.410414 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 15 01:14:48.410427 kernel: SELinux: policy capability userspace_initial_context=0 Jan 15 01:14:48.410441 systemd[1]: Successfully loaded SELinux policy in 66.591ms. Jan 15 01:14:48.410464 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.835ms. Jan 15 01:14:48.410476 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 01:14:48.410488 systemd[1]: Detected virtualization kvm. Jan 15 01:14:48.410502 systemd[1]: Detected architecture x86-64. Jan 15 01:14:48.410518 systemd[1]: Detected first boot. Jan 15 01:14:48.410539 systemd[1]: Hostname set to <ci-4515-1-0-n-d76f075714>. Jan 15 01:14:48.410552 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 01:14:48.410563 zram_generator::config[1226]: No configuration found. Jan 15 01:14:48.410578 kernel: Guest personality initialized and is inactive Jan 15 01:14:48.410590 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 15 01:14:48.410601 kernel: Initialized host personality Jan 15 01:14:48.410611 kernel: NET: Registered PF_VSOCK protocol family Jan 15 01:14:48.410624 systemd[1]: Populated /etc with preset unit settings. Jan 15 01:14:48.410636 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 15 01:14:48.410648 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 15 01:14:48.410660 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 15 01:14:48.410676 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 15 01:14:48.410688 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 15 01:14:48.410700 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 15 01:14:48.410714 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 15 01:14:48.410726 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 15 01:14:48.410737 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 15 01:14:48.410750 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 15 01:14:48.410761 systemd[1]: Created slice user.slice - User and Session Slice. Jan 15 01:14:48.410773 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 01:14:48.410785 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:14:48.410798 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 15 01:14:48.410811 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 15 01:14:48.410823 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 15 01:14:48.410836 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 01:14:48.410848 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 15 01:14:48.410861 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:14:48.410873 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:14:48.410884 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 15 01:14:48.410896 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 15 01:14:48.410907 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 15 01:14:48.410919 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 15 01:14:48.410931 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:14:48.410943 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 01:14:48.410955 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 15 01:14:48.410966 systemd[1]: Reached target slices.target - Slice Units. Jan 15 01:14:48.410978 systemd[1]: Reached target swap.target - Swaps. Jan 15 01:14:48.410989 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 15 01:14:48.411000 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 15 01:14:48.411012 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 15 01:14:48.411026 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:14:48.411037 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 15 01:14:48.411048 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:14:48.411059 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 15 01:14:48.411071 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 15 01:14:48.411082 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
Jan 15 01:14:48.411093 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:14:48.411106 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 15 01:14:48.411118 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 15 01:14:48.411128 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 15 01:14:48.411139 systemd[1]: Mounting media.mount - External Media Directory... Jan 15 01:14:48.411151 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:14:48.411163 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 15 01:14:48.411175 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 15 01:14:48.411188 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 15 01:14:48.411200 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 15 01:14:48.411211 systemd[1]: Reached target machines.target - Containers. Jan 15 01:14:48.411222 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 15 01:14:48.411234 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:14:48.411245 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 01:14:48.411257 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 01:14:48.411271 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 01:14:48.411282 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 01:14:48.413310 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 01:14:48.413327 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 01:14:48.413342 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 01:14:48.413354 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 15 01:14:48.413366 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 15 01:14:48.413378 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 15 01:14:48.413390 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 15 01:14:48.413401 systemd[1]: Stopped systemd-fsck-usr.service. Jan 15 01:14:48.413416 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:14:48.413427 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 01:14:48.413440 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 01:14:48.413450 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 01:14:48.413462 kernel: fuse: init (API version 7.41) Jan 15 01:14:48.413473 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 15 01:14:48.413485 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 15 01:14:48.413498 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 01:14:48.413510 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:14:48.413522 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 15 01:14:48.413533 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 15 01:14:48.413548 systemd[1]: Mounted media.mount - External Media Directory. Jan 15 01:14:48.413560 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 15 01:14:48.413572 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 15 01:14:48.413583 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 15 01:14:48.413595 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:14:48.413606 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 01:14:48.413617 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 15 01:14:48.413633 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 01:14:48.413665 systemd-journald[1303]: Collecting audit messages is enabled. Jan 15 01:14:48.413688 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 01:14:48.413700 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 01:14:48.413712 systemd-journald[1303]: Journal started Jan 15 01:14:48.413736 systemd-journald[1303]: Runtime Journal (/run/log/journal/0698f5eadb4d4a129fa6780908e52a90) is 8M, max 77.9M, 69.9M free. Jan 15 01:14:48.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.315000 audit: BPF prog-id=14 op=UNLOAD Jan 15 01:14:48.315000 audit: BPF prog-id=13 op=UNLOAD Jan 15 01:14:48.316000 audit: BPF prog-id=15 op=LOAD Jan 15 01:14:48.316000 audit: BPF prog-id=16 op=LOAD Jan 15 01:14:48.316000 audit: BPF prog-id=17 op=LOAD Jan 15 01:14:48.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.403000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.403000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:48.406000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 15 01:14:48.406000 audit[1303]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffd619e73b0 a2=4000 a3=0 items=0 ppid=1 pid=1303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:14:48.406000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 15 01:14:48.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.112224 systemd[1]: Queued start job for default target multi-user.target. Jan 15 01:14:48.117286 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 15 01:14:48.117768 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 15 01:14:48.417322 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 01:14:48.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.417000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.419303 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 01:14:48.420000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.423057 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 01:14:48.423475 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 01:14:48.424000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.424587 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 01:14:48.425000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:48.425361 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 01:14:48.426116 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 01:14:48.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.427850 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 15 01:14:48.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.439661 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:14:48.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.446173 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 01:14:48.448959 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 15 01:14:48.450392 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 15 01:14:48.450417 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 01:14:48.452671 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 15 01:14:48.454495 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:14:48.454597 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 01:14:48.457346 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 15 01:14:48.460999 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 15 01:14:48.462316 kernel: ACPI: bus type drm_connector registered Jan 15 01:14:48.461552 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 01:14:48.466578 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 15 01:14:48.467389 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 01:14:48.469408 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 01:14:48.476362 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 15 01:14:48.479363 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 15 01:14:48.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.480139 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Jan 15 01:14:48.480000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.480000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.480284 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 01:14:48.481065 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 15 01:14:48.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.487461 systemd-journald[1303]: Time spent on flushing to /var/log/journal/0698f5eadb4d4a129fa6780908e52a90 is 74.102ms for 1842 entries. Jan 15 01:14:48.487461 systemd-journald[1303]: System Journal (/var/log/journal/0698f5eadb4d4a129fa6780908e52a90) is 8M, max 588.1M, 580.1M free. Jan 15 01:14:48.571345 systemd-journald[1303]: Received client request to flush runtime journal. Jan 15 01:14:48.571410 kernel: loop1: detected capacity change from 0 to 119256 Jan 15 01:14:48.571427 kernel: loop2: detected capacity change from 0 to 224512 Jan 15 01:14:48.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.514000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.539000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.490622 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 15 01:14:48.511659 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 15 01:14:48.513404 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:14:48.515528 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 15 01:14:48.519850 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 15 01:14:48.539557 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:14:48.575346 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 15 01:14:48.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.595721 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 15 01:14:48.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:48.626779 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 15 01:14:48.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.629000 audit: BPF prog-id=18 op=LOAD Jan 15 01:14:48.629000 audit: BPF prog-id=19 op=LOAD Jan 15 01:14:48.629000 audit: BPF prog-id=20 op=LOAD Jan 15 01:14:48.632000 audit: BPF prog-id=21 op=LOAD Jan 15 01:14:48.631496 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 15 01:14:48.634425 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 01:14:48.637328 kernel: loop3: detected capacity change from 0 to 111544 Jan 15 01:14:48.639392 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 01:14:48.656000 audit: BPF prog-id=22 op=LOAD Jan 15 01:14:48.658000 audit: BPF prog-id=23 op=LOAD Jan 15 01:14:48.658000 audit: BPF prog-id=24 op=LOAD Jan 15 01:14:48.661000 audit: BPF prog-id=25 op=LOAD Jan 15 01:14:48.660444 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 15 01:14:48.661000 audit: BPF prog-id=26 op=LOAD Jan 15 01:14:48.661000 audit: BPF prog-id=27 op=LOAD Jan 15 01:14:48.663480 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 15 01:14:48.682527 systemd-tmpfiles[1369]: ACLs are not supported, ignoring. Jan 15 01:14:48.682842 systemd-tmpfiles[1369]: ACLs are not supported, ignoring. Jan 15 01:14:48.687309 kernel: loop4: detected capacity change from 0 to 1656 Jan 15 01:14:48.690419 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:14:48.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.715788 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 15 01:14:48.723306 kernel: loop5: detected capacity change from 0 to 119256 Jan 15 01:14:48.735529 systemd-nsresourced[1371]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 15 01:14:48.736411 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 15 01:14:48.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:48.741414 kernel: loop6: detected capacity change from 0 to 224512 Jan 15 01:14:48.764668 kernel: loop7: detected capacity change from 0 to 111544 Jan 15 01:14:48.791315 kernel: loop1: detected capacity change from 0 to 1656 Jan 15 01:14:48.797761 (sd-merge)[1381]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 15 01:14:48.806587 (sd-merge)[1381]: Merged extensions into '/usr'. Jan 15 01:14:48.811892 systemd[1]: Reload requested from client PID 1347 ('systemd-sysext') (unit systemd-sysext.service)... 
Jan 15 01:14:48.811907 systemd[1]: Reloading... Jan 15 01:14:48.828922 systemd-oomd[1367]: No swap; memory pressure usage will be degraded Jan 15 01:14:48.839790 systemd-resolved[1368]: Positive Trust Anchors: Jan 15 01:14:48.840059 systemd-resolved[1368]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 01:14:48.840103 systemd-resolved[1368]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 01:14:48.840162 systemd-resolved[1368]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 01:14:48.866497 systemd-resolved[1368]: Using system hostname 'ci-4515-1-0-n-d76f075714'. Jan 15 01:14:48.910322 zram_generator::config[1420]: No configuration found. Jan 15 01:14:49.098195 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 15 01:14:49.098613 systemd[1]: Reloading finished in 286 ms. Jan 15 01:14:49.118331 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 15 01:14:49.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.120119 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 01:14:49.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.120985 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 15 01:14:49.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.121867 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 15 01:14:49.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.125574 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:14:49.131684 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 15 01:14:49.142424 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 15 01:14:49.149453 systemd[1]: Starting ensure-sysext.service... Jan 15 01:14:49.153443 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 15 01:14:49.153000 audit: BPF prog-id=8 op=UNLOAD Jan 15 01:14:49.153000 audit: BPF prog-id=7 op=UNLOAD Jan 15 01:14:49.154000 audit: BPF prog-id=28 op=LOAD Jan 15 01:14:49.154000 audit: BPF prog-id=29 op=LOAD Jan 15 01:14:49.156493 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:14:49.158000 audit: BPF prog-id=30 op=LOAD Jan 15 01:14:49.158000 audit: BPF prog-id=21 op=UNLOAD Jan 15 01:14:49.160000 audit: BPF prog-id=31 op=LOAD Jan 15 01:14:49.161000 audit: BPF prog-id=18 op=UNLOAD Jan 15 01:14:49.161000 audit: BPF prog-id=32 op=LOAD Jan 15 01:14:49.161000 audit: BPF prog-id=33 op=LOAD Jan 15 01:14:49.161000 audit: BPF prog-id=19 op=UNLOAD Jan 15 01:14:49.161000 audit: BPF prog-id=20 op=UNLOAD Jan 15 01:14:49.161000 audit: BPF prog-id=34 op=LOAD Jan 15 01:14:49.162000 audit: BPF prog-id=22 op=UNLOAD Jan 15 01:14:49.162000 audit: BPF prog-id=35 op=LOAD Jan 15 01:14:49.162000 audit: BPF prog-id=36 op=LOAD Jan 15 01:14:49.162000 audit: BPF prog-id=23 op=UNLOAD Jan 15 01:14:49.162000 audit: BPF prog-id=24 op=UNLOAD Jan 15 01:14:49.162000 audit: BPF prog-id=37 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=15 op=UNLOAD Jan 15 01:14:49.163000 audit: BPF prog-id=38 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=39 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=16 op=UNLOAD Jan 15 01:14:49.163000 audit: BPF prog-id=17 op=UNLOAD Jan 15 01:14:49.163000 audit: BPF prog-id=40 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=25 op=UNLOAD Jan 15 01:14:49.163000 audit: BPF prog-id=41 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=42 op=LOAD Jan 15 01:14:49.163000 audit: BPF prog-id=26 op=UNLOAD Jan 15 01:14:49.163000 audit: BPF prog-id=27 op=UNLOAD Jan 15 01:14:49.166566 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 15 01:14:49.167139 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 15 01:14:49.172575 systemd[1]: Reload requested from client PID 1466 ('systemctl') (unit ensure-sysext.service)... Jan 15 01:14:49.172588 systemd[1]: Reloading... Jan 15 01:14:49.181848 systemd-tmpfiles[1467]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 15 01:14:49.181875 systemd-tmpfiles[1467]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 15 01:14:49.182071 systemd-tmpfiles[1467]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 15 01:14:49.185615 systemd-tmpfiles[1467]: ACLs are not supported, ignoring. Jan 15 01:14:49.185710 systemd-tmpfiles[1467]: ACLs are not supported, ignoring. Jan 15 01:14:49.199513 systemd-tmpfiles[1467]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 01:14:49.199525 systemd-tmpfiles[1467]: Skipping /boot Jan 15 01:14:49.210559 systemd-tmpfiles[1467]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 01:14:49.210570 systemd-tmpfiles[1467]: Skipping /boot Jan 15 01:14:49.223754 systemd-udevd[1468]: Using default interface naming scheme 'v257'. Jan 15 01:14:49.258320 zram_generator::config[1502]: No configuration found. 
Jan 15 01:14:49.369345 kernel: mousedev: PS/2 mouse device common for all mice Jan 15 01:14:49.388308 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 15 01:14:49.395352 kernel: ACPI: button: Power Button [PWRF] Jan 15 01:14:49.519670 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 15 01:14:49.519971 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 15 01:14:49.521480 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 15 01:14:49.557881 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 01:14:49.558626 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 15 01:14:49.558924 systemd[1]: Reloading finished in 386 ms. Jan 15 01:14:49.568334 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 15 01:14:49.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.569317 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 01:14:49.570000 audit: BPF prog-id=43 op=LOAD Jan 15 01:14:49.571000 audit: BPF prog-id=44 op=LOAD Jan 15 01:14:49.571000 audit: BPF prog-id=28 op=UNLOAD Jan 15 01:14:49.571000 audit: BPF prog-id=29 op=UNLOAD Jan 15 01:14:49.573000 audit: BPF prog-id=45 op=LOAD Jan 15 01:14:49.573000 audit: BPF prog-id=34 op=UNLOAD Jan 15 01:14:49.573000 audit: BPF prog-id=46 op=LOAD Jan 15 01:14:49.573000 audit: BPF prog-id=47 op=LOAD Jan 15 01:14:49.573000 audit: BPF prog-id=35 op=UNLOAD Jan 15 01:14:49.573000 audit: BPF prog-id=36 op=UNLOAD Jan 15 01:14:49.573000 audit: BPF prog-id=48 op=LOAD Jan 15 01:14:49.573000 audit: BPF prog-id=40 op=UNLOAD Jan 15 01:14:49.574000 audit: BPF prog-id=49 op=LOAD Jan 15 01:14:49.574000 audit: BPF prog-id=50 op=LOAD Jan 15 01:14:49.574000 audit: BPF prog-id=41 op=UNLOAD Jan 15 01:14:49.574000 audit: BPF prog-id=42 op=UNLOAD Jan 15 01:14:49.574000 audit: BPF prog-id=51 op=LOAD Jan 15 01:14:49.574000 audit: BPF prog-id=30 op=UNLOAD Jan 15 01:14:49.575000 audit: BPF prog-id=52 op=LOAD Jan 15 01:14:49.575000 audit: BPF prog-id=31 op=UNLOAD Jan 15 01:14:49.575000 audit: BPF prog-id=53 op=LOAD Jan 15 01:14:49.575000 audit: BPF prog-id=54 op=LOAD Jan 15 01:14:49.575000 audit: BPF prog-id=32 op=UNLOAD Jan 15 01:14:49.575000 audit: BPF prog-id=33 op=UNLOAD Jan 15 01:14:49.576000 audit: BPF prog-id=55 op=LOAD Jan 15 01:14:49.576000 audit: BPF prog-id=37 op=UNLOAD Jan 15 01:14:49.576000 audit: BPF prog-id=56 op=LOAD Jan 15 01:14:49.576000 audit: BPF prog-id=57 op=LOAD Jan 15 01:14:49.576000 audit: BPF prog-id=38 op=UNLOAD Jan 15 01:14:49.576000 audit: BPF prog-id=39 op=UNLOAD Jan 15 01:14:49.585000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.585398 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 15 01:14:49.599517 kernel: Console: switching to colour dummy device 80x25 Jan 15 01:14:49.599573 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 15 01:14:49.599779 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 15 01:14:49.599803 kernel: [drm] features: -context_init Jan 15 01:14:49.624311 kernel: [drm] number of scanouts: 1 Jan 15 01:14:49.632325 kernel: [drm] number of cap sets: 0 Jan 15 01:14:49.637444 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:14:49.639643 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 01:14:49.643377 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 15 01:14:49.643640 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:14:49.646540 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 01:14:49.648708 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 01:14:49.649363 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 15 01:14:49.656557 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 01:14:49.659552 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 01:14:49.666426 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 15 01:14:49.666654 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:14:49.666815 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 01:14:49.669345 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 15 01:14:49.671958 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 15 01:14:49.672037 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:14:49.674000 audit: BPF prog-id=58 op=LOAD Jan 15 01:14:49.674105 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 15 01:14:49.678360 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 01:14:49.678493 systemd[1]: Reached target time-set.target - System Time Set. Jan 15 01:14:49.682226 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 15 01:14:49.682352 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:14:49.686046 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 15 01:14:49.686089 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 01:14:49.694513 systemd[1]: Finished ensure-sysext.service. Jan 15 01:14:49.741799 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 15 01:14:49.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:14:49.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.743511 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 15 01:14:49.753000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.751949 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 15 01:14:49.758000 audit[1600]: SYSTEM_BOOT pid=1600 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.760630 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:49.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.792345 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 15 01:14:49.797228 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 01:14:49.798189 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 01:14:49.798000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.798000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.800000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.799866 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 01:14:49.800054 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 01:14:49.800845 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 01:14:49.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.801337 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 01:14:49.803909 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Jan 15 01:14:49.804787 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 01:14:49.805000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.805000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.816638 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 01:14:49.817038 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 01:14:49.820570 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:14:49.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:14:49.824486 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:49.834108 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:49.841000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 15 01:14:49.841000 audit[1638]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffd76975c0 a2=420 a3=0 items=0 ppid=1589 pid=1638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:14:49.841000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 01:14:49.842824 augenrules[1638]: No rules Jan 15 01:14:49.843482 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 01:14:49.843819 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 01:14:49.845308 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 15 01:14:49.845346 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 15 01:14:49.857312 kernel: PTP clock support registered Jan 15 01:14:49.861160 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 15 01:14:49.861588 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 15 01:14:49.895576 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 15 01:14:49.897140 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 15 01:14:49.901242 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:14:49.902045 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 15 01:14:49.908435 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:14:49.953433 systemd-networkd[1599]: lo: Link UP Jan 15 01:14:49.953694 systemd-networkd[1599]: lo: Gained carrier Jan 15 01:14:49.955973 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 01:14:49.956163 systemd[1]: Reached target network.target - Network. Jan 15 01:14:49.956476 systemd-networkd[1599]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:14:49.956537 systemd-networkd[1599]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 01:14:49.957078 systemd-networkd[1599]: eth0: Link UP Jan 15 01:14:49.957412 systemd-networkd[1599]: eth0: Gained carrier Jan 15 01:14:49.957469 systemd-networkd[1599]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:14:49.960454 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 15 01:14:49.962028 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 15 01:14:49.970349 systemd-networkd[1599]: eth0: DHCPv4 address 10.0.7.78/25, gateway 10.0.7.1 acquired from 10.0.7.1 Jan 15 01:14:50.006099 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 15 01:14:50.045516 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:14:50.643417 ldconfig[1596]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 15 01:14:50.649269 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 15 01:14:50.653562 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 15 01:14:50.674020 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 15 01:14:50.675777 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 01:14:50.676863 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 15 01:14:50.678629 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 15 01:14:50.679059 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 15 01:14:50.679591 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 15 01:14:50.680018 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 15 01:14:50.681529 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 15 01:14:50.681980 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 15 01:14:50.682347 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 15 01:14:50.682702 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 15 01:14:50.682735 systemd[1]: Reached target paths.target - Path Units. Jan 15 01:14:50.683067 systemd[1]: Reached target timers.target - Timer Units. Jan 15 01:14:50.687042 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 15 01:14:50.688790 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Jan 15 01:14:50.692073 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 15 01:14:50.694055 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 15 01:14:50.695940 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 15 01:14:50.708069 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 15 01:14:50.711512 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 15 01:14:50.713727 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 15 01:14:50.715597 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 01:14:50.717036 systemd[1]: Reached target basic.target - Basic System. Jan 15 01:14:50.717910 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 15 01:14:50.718058 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 15 01:14:50.720821 systemd[1]: Starting chronyd.service - NTP client/server... Jan 15 01:14:50.723434 systemd[1]: Starting containerd.service - containerd container runtime... Jan 15 01:14:50.729666 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 15 01:14:50.733432 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 15 01:14:50.736443 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 15 01:14:50.739428 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 15 01:14:50.745554 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 15 01:14:50.747379 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 15 01:14:50.749694 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 15 01:14:50.755508 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:50.756117 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 15 01:14:50.762455 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 15 01:14:50.765489 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 15 01:14:50.767758 jq[1670]: false Jan 15 01:14:50.773450 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 15 01:14:50.781952 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 15 01:14:50.785896 extend-filesystems[1671]: Found /dev/vda6 Jan 15 01:14:50.783146 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 15 01:14:50.786561 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 15 01:14:50.788183 systemd[1]: Starting update-engine.service - Update Engine... Jan 15 01:14:50.801318 extend-filesystems[1671]: Found /dev/vda9 Jan 15 01:14:50.800506 oslogin_cache_refresh[1672]: Refreshing passwd entry cache Jan 15 01:14:50.802062 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Refreshing passwd entry cache Jan 15 01:14:50.802799 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Jan 15 01:14:50.810857 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 15 01:14:50.819679 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 15 01:14:50.819923 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 15 01:14:50.820965 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 15 01:14:50.821151 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 15 01:14:50.829457 extend-filesystems[1671]: Checking size of /dev/vda9 Jan 15 01:14:50.842460 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Failure getting users, quitting Jan 15 01:14:50.842460 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 15 01:14:50.842460 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Refreshing group entry cache Jan 15 01:14:50.842364 oslogin_cache_refresh[1672]: Failure getting users, quitting Jan 15 01:14:50.842380 oslogin_cache_refresh[1672]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 15 01:14:50.842420 oslogin_cache_refresh[1672]: Refreshing group entry cache Jan 15 01:14:50.851802 update_engine[1684]: I20260115 01:14:50.851727 1684 main.cc:92] Flatcar Update Engine starting Jan 15 01:14:50.856888 oslogin_cache_refresh[1672]: Failure getting groups, quitting Jan 15 01:14:50.857975 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Failure getting groups, quitting Jan 15 01:14:50.857975 google_oslogin_nss_cache[1672]: oslogin_cache_refresh[1672]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 15 01:14:50.856899 oslogin_cache_refresh[1672]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 15 01:14:50.860484 extend-filesystems[1671]: Resized partition /dev/vda9 Jan 15 01:14:50.870175 jq[1685]: true Jan 15 01:14:50.864427 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 15 01:14:50.864668 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 15 01:14:50.869864 systemd[1]: motdgen.service: Deactivated successfully. Jan 15 01:14:50.870636 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 15 01:14:50.874967 extend-filesystems[1716]: resize2fs 1.47.3 (8-Jul-2025) Jan 15 01:14:50.885659 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 15 01:14:50.897898 jq[1719]: true Jan 15 01:14:50.899106 dbus-daemon[1668]: [system] SELinux support is enabled Jan 15 01:14:50.899327 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 15 01:14:50.904572 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 15 01:14:50.904598 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 15 01:14:50.909404 tar[1694]: linux-amd64/LICENSE Jan 15 01:14:50.909404 tar[1694]: linux-amd64/helm Jan 15 01:14:50.910024 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 15 01:14:50.910356 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Jan 15 01:14:50.927174 chronyd[1665]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 15 01:14:50.931402 update_engine[1684]: I20260115 01:14:50.930527 1684 update_check_scheduler.cc:74] Next update check in 3m45s Jan 15 01:14:50.928097 systemd[1]: Started update-engine.service - Update Engine. Jan 15 01:14:50.934879 chronyd[1665]: Loaded seccomp filter (level 2) Jan 15 01:14:50.936033 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 15 01:14:50.937347 systemd[1]: Started chronyd.service - NTP client/server. Jan 15 01:14:51.029147 systemd-logind[1682]: New seat seat0. Jan 15 01:14:51.054094 systemd-logind[1682]: Watching system buttons on /dev/input/event3 (Power Button) Jan 15 01:14:51.054151 systemd-logind[1682]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 15 01:14:51.055255 systemd[1]: Started systemd-logind.service - User Login Management. Jan 15 01:14:51.101944 bash[1740]: Updated "/home/core/.ssh/authorized_keys" Jan 15 01:14:51.104027 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 15 01:14:51.110354 systemd[1]: Starting sshkeys.service... Jan 15 01:14:51.157799 locksmithd[1724]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 15 01:14:51.166670 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 15 01:14:51.172167 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 15 01:14:51.195830 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:51.265722 containerd[1715]: time="2026-01-15T01:14:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 15 01:14:51.266851 containerd[1715]: time="2026-01-15T01:14:51.266806571Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 15 01:14:51.275045 containerd[1715]: time="2026-01-15T01:14:51.274994029Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.806µs" Jan 15 01:14:51.275045 containerd[1715]: time="2026-01-15T01:14:51.275028787Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 15 01:14:51.275146 containerd[1715]: time="2026-01-15T01:14:51.275073635Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 15 01:14:51.275146 containerd[1715]: time="2026-01-15T01:14:51.275090303Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 15 01:14:51.275229 containerd[1715]: time="2026-01-15T01:14:51.275220612Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 15 01:14:51.275247 containerd[1715]: time="2026-01-15T01:14:51.275235537Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275285407Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: 
time="2026-01-15T01:14:51.275312275Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275650463Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275670743Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275685168Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275696550Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275832405Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275842868Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.275916251Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.276059571Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276339 containerd[1715]: time="2026-01-15T01:14:51.276087112Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 01:14:51.276575 containerd[1715]: time="2026-01-15T01:14:51.276099123Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 15 01:14:51.276575 containerd[1715]: time="2026-01-15T01:14:51.276123322Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 15 01:14:51.277025 containerd[1715]: time="2026-01-15T01:14:51.277007136Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 15 01:14:51.277120 containerd[1715]: time="2026-01-15T01:14:51.277109149Z" level=info msg="metadata content store policy set" policy=shared Jan 15 01:14:51.290319 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 15 01:14:51.291741 systemd-networkd[1599]: eth0: Gained IPv6LL Jan 15 01:14:51.293832 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 15 01:14:51.296551 systemd[1]: Reached target network-online.target - Network is Online. Jan 15 01:14:51.300587 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:14:51.304614 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Jan 15 01:14:51.320233 containerd[1715]: time="2026-01-15T01:14:51.320177613Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 15 01:14:51.320818 extend-filesystems[1716]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 15 01:14:51.320818 extend-filesystems[1716]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 15 01:14:51.320818 extend-filesystems[1716]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 15 01:14:51.336501 extend-filesystems[1671]: Resized filesystem in /dev/vda9 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.320952143Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325423034Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325446701Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325471159Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325484302Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325497665Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325506948Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325518423Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325530217Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325542169Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325558327Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325568755Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 15 01:14:51.337010 containerd[1715]: time="2026-01-15T01:14:51.325583557Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 15 01:14:51.322502 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325685464Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325702320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325716345Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325731576Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325741497Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325751023Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325764062Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325779683Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325791533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325801670Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325810943Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325838465Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325882672Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325896839Z" level=info msg="Start snapshots syncer" Jan 15 01:14:51.339507 containerd[1715]: time="2026-01-15T01:14:51.325917946Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 15 01:14:51.323085 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
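The extend-filesystems unit grew the mounted root filesystem on /dev/vda9 online to 11516923 4k blocks, as the kernel EXT4-fs line above confirms. A manual equivalent, sketched here on the assumption that the underlying partition has already been enlarged, is the usual resize2fs flow:

    # Grow a mounted ext4 filesystem to fill its (already enlarged) partition.
    lsblk /dev/vda9                                  # check the partition size first
    resize2fs /dev/vda9                              # online grow; safe on a mounted ext4
    dumpe2fs -h /dev/vda9 | grep -i 'block count'    # should now report 11516923 blocks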
Jan 15 01:14:51.339812 containerd[1715]: time="2026-01-15T01:14:51.326623419Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 15 01:14:51.339812 containerd[1715]: time="2026-01-15T01:14:51.328528092Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.330679711Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333574221Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333612300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333626622Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333638290Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333650299Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333663428Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333674690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: 
time="2026-01-15T01:14:51.333684309Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.333693784Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.335427050Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.335452205Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 01:14:51.339933 containerd[1715]: time="2026-01-15T01:14:51.335461603Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335470774Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335478178Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335487287Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335498666Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335516131Z" level=info msg="runtime interface created" Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335521132Z" level=info msg="created NRI interface" Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335530024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335545514Z" level=info msg="Connect containerd service" Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.335580605Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 15 01:14:51.340136 containerd[1715]: time="2026-01-15T01:14:51.336159960Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 01:14:51.405884 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Jan 15 01:14:51.517575 sshd_keygen[1711]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 15 01:14:51.526289 containerd[1715]: time="2026-01-15T01:14:51.526080011Z" level=info msg="Start subscribing containerd event" Jan 15 01:14:51.526289 containerd[1715]: time="2026-01-15T01:14:51.526145653Z" level=info msg="Start recovering state" Jan 15 01:14:51.526289 containerd[1715]: time="2026-01-15T01:14:51.526242844Z" level=info msg="Start event monitor" Jan 15 01:14:51.526289 containerd[1715]: time="2026-01-15T01:14:51.526256403Z" level=info msg="Start cni network conf syncer for default" Jan 15 01:14:51.526289 containerd[1715]: time="2026-01-15T01:14:51.526265631Z" level=info msg="Start streaming server" Jan 15 01:14:51.527826 containerd[1715]: time="2026-01-15T01:14:51.526273271Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 15 01:14:51.527826 containerd[1715]: time="2026-01-15T01:14:51.526476831Z" level=info msg="runtime interface starting up..." Jan 15 01:14:51.527826 containerd[1715]: time="2026-01-15T01:14:51.526483265Z" level=info msg="starting plugins..." Jan 15 01:14:51.527826 containerd[1715]: time="2026-01-15T01:14:51.526495674Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 15 01:14:51.527939 containerd[1715]: time="2026-01-15T01:14:51.527765094Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 15 01:14:51.527939 containerd[1715]: time="2026-01-15T01:14:51.527903175Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 15 01:14:51.532227 containerd[1715]: time="2026-01-15T01:14:51.531362598Z" level=info msg="containerd successfully booted in 0.266741s" Jan 15 01:14:51.532095 systemd[1]: Started containerd.service - containerd container runtime. Jan 15 01:14:51.553768 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 15 01:14:51.557593 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 15 01:14:51.574902 systemd[1]: issuegen.service: Deactivated successfully. Jan 15 01:14:51.575137 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 15 01:14:51.577280 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 15 01:14:51.600166 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 15 01:14:51.604601 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 15 01:14:51.607449 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 15 01:14:51.609091 systemd[1]: Reached target getty.target - Login Prompts. Jan 15 01:14:51.732808 tar[1694]: linux-amd64/README.md Jan 15 01:14:51.749905 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 15 01:14:51.820329 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:52.209319 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:52.509616 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:14:52.529288 (kubelet)[1808]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:14:53.174946 kubelet[1808]: E0115 01:14:53.174886 1808 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:14:53.177323 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:14:53.177652 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:14:53.178356 systemd[1]: kubelet.service: Consumed 994ms CPU time, 263.2M memory peak. Jan 15 01:14:53.834372 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:54.219318 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:57.842322 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:57.848371 coreos-metadata[1667]: Jan 15 01:14:57.848 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:14:57.864236 coreos-metadata[1667]: Jan 15 01:14:57.864 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 15 01:14:58.231364 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:14:58.238202 coreos-metadata[1753]: Jan 15 01:14:58.238 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:14:58.250591 coreos-metadata[1753]: Jan 15 01:14:58.250 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 15 01:14:58.416278 coreos-metadata[1667]: Jan 15 01:14:58.416 INFO Fetch successful Jan 15 01:14:58.416495 coreos-metadata[1667]: Jan 15 01:14:58.416 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 01:14:58.548574 coreos-metadata[1667]: Jan 15 01:14:58.548 INFO Fetch successful Jan 15 01:14:58.548574 coreos-metadata[1667]: Jan 15 01:14:58.548 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 15 01:14:58.640632 coreos-metadata[1753]: Jan 15 01:14:58.640 INFO Fetch successful Jan 15 01:14:58.640632 coreos-metadata[1753]: Jan 15 01:14:58.640 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 15 01:14:58.794757 coreos-metadata[1667]: Jan 15 01:14:58.794 INFO Fetch successful Jan 15 01:14:58.794757 coreos-metadata[1667]: Jan 15 01:14:58.794 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 15 01:14:58.889544 coreos-metadata[1753]: Jan 15 01:14:58.889 INFO Fetch successful Jan 15 01:14:58.891490 unknown[1753]: wrote ssh authorized keys file for user: core Jan 15 01:14:58.914184 update-ssh-keys[1826]: Updated "/home/core/.ssh/authorized_keys" Jan 15 01:14:58.914931 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 15 01:14:58.916486 systemd[1]: Finished sshkeys.service. 
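The kubelet failure above, repeated on every scheduled restart for the rest of this log, is the normal state of a node that has containerd and the kubelet installed but has not yet been joined to a cluster: /var/lib/kubelet/config.yaml is only written during bootstrap, and the KUBELET_KUBEADM_ARGS drop-in variable hints that kubeadm is the intended path. A sketch of how that would be confirmed and resolved, assuming kubeadm is indeed the chosen tool:

    # The kubelet exits because its config file has not been generated yet.
    ls -l /var/lib/kubelet/config.yaml        # ENOENT until the node is bootstrapped
    systemctl status kubelet --no-pager       # shows the restart loop recorded above
    # Running `kubeadm init` on a control-plane node or `kubeadm join ...` on a
    # worker writes /var/lib/kubelet/config.yaml, after which the next scheduled
    # restart of kubelet.service succeeds.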
Jan 15 01:14:58.929028 coreos-metadata[1667]: Jan 15 01:14:58.928 INFO Fetch successful Jan 15 01:14:58.929028 coreos-metadata[1667]: Jan 15 01:14:58.929 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 15 01:14:59.063587 coreos-metadata[1667]: Jan 15 01:14:59.063 INFO Fetch successful Jan 15 01:14:59.063587 coreos-metadata[1667]: Jan 15 01:14:59.063 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 15 01:14:59.196273 coreos-metadata[1667]: Jan 15 01:14:59.195 INFO Fetch successful Jan 15 01:14:59.229668 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 15 01:14:59.230118 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 15 01:14:59.230247 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 15 01:14:59.231051 systemd[1]: Startup finished in 3.734s (kernel) + 14.709s (initrd) + 11.864s (userspace) = 30.308s. Jan 15 01:15:03.428103 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 15 01:15:03.430281 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:03.563595 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:15:03.573755 (kubelet)[1842]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:03.624948 kubelet[1842]: E0115 01:15:03.624892 1842 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:03.628372 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:03.628515 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:03.628851 systemd[1]: kubelet.service: Consumed 154ms CPU time, 109M memory peak. Jan 15 01:15:13.879369 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 15 01:15:13.882374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:14.018235 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:15:14.031674 (kubelet)[1859]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:14.075567 kubelet[1859]: E0115 01:15:14.075513 1859 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:14.077857 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:14.077984 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:14.078557 systemd[1]: kubelet.service: Consumed 144ms CPU time, 108M memory peak. Jan 15 01:15:14.736476 chronyd[1665]: Selected source PHC0 Jan 15 01:15:15.920272 systemd-resolved[1368]: Clock change detected. Flushing caches. 
Jan 15 01:15:14.736503 chronyd[1665]: System clock wrong by 1.183718 seconds Jan 15 01:15:15.921376 chronyd[1665]: System clock was stepped by 1.183718 seconds Jan 15 01:15:25.441525 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 15 01:15:25.443336 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:25.584381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:15:25.601384 (kubelet)[1873]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:25.643257 kubelet[1873]: E0115 01:15:25.643183 1873 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:25.645414 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:25.645653 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:25.646275 systemd[1]: kubelet.service: Consumed 141ms CPU time, 110.3M memory peak. Jan 15 01:15:35.691528 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 15 01:15:35.693782 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:35.832048 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:15:35.846404 (kubelet)[1888]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:35.893196 kubelet[1888]: E0115 01:15:35.893114 1888 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:35.895343 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:35.895573 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:35.896246 systemd[1]: kubelet.service: Consumed 153ms CPU time, 110.4M memory peak. Jan 15 01:15:37.150983 update_engine[1684]: I20260115 01:15:37.150877 1684 update_attempter.cc:509] Updating boot flags... Jan 15 01:15:45.941628 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 15 01:15:45.943568 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:46.074950 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:15:46.090245 (kubelet)[1920]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:46.137173 kubelet[1920]: E0115 01:15:46.137113 1920 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:46.139557 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:46.139734 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:46.140403 systemd[1]: kubelet.service: Consumed 152ms CPU time, 110.5M memory peak. Jan 15 01:15:56.191503 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jan 15 01:15:56.193004 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:15:56.346093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:15:56.359562 (kubelet)[1935]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:15:56.400635 kubelet[1935]: E0115 01:15:56.400593 1935 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:15:56.402752 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:15:56.402881 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:15:56.403471 systemd[1]: kubelet.service: Consumed 143ms CPU time, 108.2M memory peak. Jan 15 01:16:06.445064 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jan 15 01:16:06.447425 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:06.661556 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:16:06.675960 (kubelet)[1949]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:06.715389 kubelet[1949]: E0115 01:16:06.715248 1949 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:06.717761 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:06.717898 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:06.718559 systemd[1]: kubelet.service: Consumed 158ms CPU time, 110.2M memory peak. Jan 15 01:16:16.941685 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jan 15 01:16:16.943669 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:17.115530 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:16:17.131551 (kubelet)[1965]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:17.177729 kubelet[1965]: E0115 01:16:17.177684 1965 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:17.179871 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:17.180001 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:17.180711 systemd[1]: kubelet.service: Consumed 150ms CPU time, 108M memory peak. Jan 15 01:16:27.191477 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jan 15 01:16:27.193076 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:27.385027 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:16:27.396703 (kubelet)[1979]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:27.442333 kubelet[1979]: E0115 01:16:27.442208 1979 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:27.444627 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:27.444773 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:27.445449 systemd[1]: kubelet.service: Consumed 150ms CPU time, 107.5M memory peak. Jan 15 01:16:37.691927 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Jan 15 01:16:37.694549 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:37.877919 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:16:37.886495 (kubelet)[1995]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:37.932696 kubelet[1995]: E0115 01:16:37.932622 1995 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:37.935043 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:37.935187 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:37.935821 systemd[1]: kubelet.service: Consumed 158ms CPU time, 107.8M memory peak. Jan 15 01:16:47.941505 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Jan 15 01:16:47.945237 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:48.134089 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:16:48.142529 (kubelet)[2009]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:48.185684 kubelet[2009]: E0115 01:16:48.185600 2009 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:48.187847 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:48.187991 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:48.188611 systemd[1]: kubelet.service: Consumed 153ms CPU time, 108M memory peak. Jan 15 01:16:58.191502 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Jan 15 01:16:58.193644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:16:58.349436 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:16:58.357542 (kubelet)[2025]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:16:58.398008 kubelet[2025]: E0115 01:16:58.397931 2025 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:16:58.400431 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:16:58.400677 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:16:58.401350 systemd[1]: kubelet.service: Consumed 149ms CPU time, 110.3M memory peak. Jan 15 01:17:08.441746 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. Jan 15 01:17:08.444688 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:17:08.601843 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:17:08.615477 (kubelet)[2041]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:17:08.649132 kubelet[2041]: E0115 01:17:08.649088 2041 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:17:08.651352 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:17:08.651479 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:17:08.651963 systemd[1]: kubelet.service: Consumed 145ms CPU time, 110.2M memory peak. Jan 15 01:17:18.691648 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14. Jan 15 01:17:18.694139 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:17:18.862173 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:17:18.872456 (kubelet)[2056]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:17:18.909561 kubelet[2056]: E0115 01:17:18.909471 2056 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:17:18.911812 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:17:18.911951 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:17:18.912813 systemd[1]: kubelet.service: Consumed 156ms CPU time, 108.5M memory peak. Jan 15 01:17:28.941718 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 15. Jan 15 01:17:28.943898 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:17:29.123628 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:17:29.133487 (kubelet)[2070]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:17:29.167200 kubelet[2070]: E0115 01:17:29.167132 2070 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:17:29.168525 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:17:29.168652 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:17:29.169178 systemd[1]: kubelet.service: Consumed 147ms CPU time, 110.2M memory peak. Jan 15 01:17:39.191521 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 16. Jan 15 01:17:39.193259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:17:39.398632 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:17:39.405278 (kubelet)[2085]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:17:39.443758 kubelet[2085]: E0115 01:17:39.443664 2085 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:17:39.445429 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:17:39.445551 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:17:39.446112 systemd[1]: kubelet.service: Consumed 154ms CPU time, 108.3M memory peak. Jan 15 01:17:49.691514 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 17. Jan 15 01:17:49.693988 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:17:49.843537 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:17:49.851461 (kubelet)[2099]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:17:49.887786 kubelet[2099]: E0115 01:17:49.887712 2099 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:17:49.890963 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:17:49.891128 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:17:49.892384 systemd[1]: kubelet.service: Consumed 141ms CPU time, 108M memory peak. Jan 15 01:17:59.941688 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 18. Jan 15 01:17:59.943451 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:00.226140 (kubelet)[2113]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:00.228449 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:18:00.266822 kubelet[2113]: E0115 01:18:00.266776 2113 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:00.268525 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:00.268659 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:00.269191 systemd[1]: kubelet.service: Consumed 156ms CPU time, 109.9M memory peak. Jan 15 01:18:10.441650 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 19. Jan 15 01:18:10.443895 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:10.822848 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:18:10.829486 (kubelet)[2128]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:10.869870 kubelet[2128]: E0115 01:18:10.869820 2128 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:10.871723 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:10.871860 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:10.872234 systemd[1]: kubelet.service: Consumed 179ms CPU time, 112M memory peak. Jan 15 01:18:20.941558 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 20. Jan 15 01:18:20.942987 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:21.216545 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:18:21.227465 (kubelet)[2141]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:21.260468 kubelet[2141]: E0115 01:18:21.260428 2141 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:21.262373 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:21.262502 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:21.262841 systemd[1]: kubelet.service: Consumed 140ms CPU time, 110.3M memory peak. Jan 15 01:18:31.441681 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 21. Jan 15 01:18:31.443128 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:31.809790 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:18:31.823562 (kubelet)[2156]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:31.856606 kubelet[2156]: E0115 01:18:31.856566 2156 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:31.858146 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:31.858267 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:31.858584 systemd[1]: kubelet.service: Consumed 140ms CPU time, 108.2M memory peak. Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.179842 1684 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.179889 1684 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180052 1684 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180411 1684 omaha_request_params.cc:62] Current group set to beta Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180500 1684 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180506 1684 update_attempter.cc:643] Scheduling an action processor start. 
Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180521 1684 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180548 1684 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180598 1684 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180603 1684 omaha_request_action.cc:272] Request: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: Jan 15 01:18:37.180687 update_engine[1684]: I20260115 01:18:37.180609 1684 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 01:18:37.181339 locksmithd[1724]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 15 01:18:37.182217 update_engine[1684]: I20260115 01:18:37.182166 1684 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 01:18:37.182649 update_engine[1684]: I20260115 01:18:37.182610 1684 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 01:18:37.190127 update_engine[1684]: E20260115 01:18:37.190073 1684 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 01:18:37.190230 update_engine[1684]: I20260115 01:18:37.190151 1684 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 15 01:18:41.941499 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 22. Jan 15 01:18:41.943154 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:42.290291 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:18:42.299615 (kubelet)[2171]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:42.340773 kubelet[2171]: E0115 01:18:42.340731 2171 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:42.343690 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:42.343820 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:42.344353 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.1M memory peak. Jan 15 01:18:47.110614 update_engine[1684]: I20260115 01:18:47.110539 1684 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 01:18:47.111506 update_engine[1684]: I20260115 01:18:47.111114 1684 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 01:18:47.111506 update_engine[1684]: I20260115 01:18:47.111465 1684 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 15 01:18:47.118726 update_engine[1684]: E20260115 01:18:47.118680 1684 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 01:18:47.118903 update_engine[1684]: I20260115 01:18:47.118888 1684 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 15 01:18:52.441738 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 23. Jan 15 01:18:52.443927 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:18:52.810959 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:18:52.823359 (kubelet)[2186]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:18:52.855911 kubelet[2186]: E0115 01:18:52.855873 2186 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:18:52.858146 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:18:52.858271 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:18:52.858777 systemd[1]: kubelet.service: Consumed 138ms CPU time, 110.3M memory peak. Jan 15 01:18:57.116262 update_engine[1684]: I20260115 01:18:57.116143 1684 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 01:18:57.116262 update_engine[1684]: I20260115 01:18:57.116253 1684 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 01:18:57.116673 update_engine[1684]: I20260115 01:18:57.116629 1684 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 01:18:57.122944 update_engine[1684]: E20260115 01:18:57.122874 1684 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 01:18:57.123096 update_engine[1684]: I20260115 01:18:57.122967 1684 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 15 01:19:02.941883 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 24. Jan 15 01:19:02.944579 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:03.232523 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:03.244525 (kubelet)[2200]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:19:03.279282 kubelet[2200]: E0115 01:19:03.279241 2200 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:19:03.280999 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:19:03.281133 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:19:03.281483 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.7M memory peak. 
Jan 15 01:19:07.116516 update_engine[1684]: I20260115 01:19:07.116068 1684 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 01:19:07.116516 update_engine[1684]: I20260115 01:19:07.116175 1684 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 01:19:07.116516 update_engine[1684]: I20260115 01:19:07.116472 1684 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 01:19:07.122980 update_engine[1684]: E20260115 01:19:07.122364 1684 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122452 1684 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122460 1684 omaha_request_action.cc:617] Omaha request response: Jan 15 01:19:07.122980 update_engine[1684]: E20260115 01:19:07.122532 1684 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122550 1684 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122555 1684 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122560 1684 update_attempter.cc:306] Processing Done. Jan 15 01:19:07.122980 update_engine[1684]: E20260115 01:19:07.122571 1684 update_attempter.cc:619] Update failed. Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122576 1684 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122581 1684 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122586 1684 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122651 1684 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122680 1684 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 15 01:19:07.122980 update_engine[1684]: I20260115 01:19:07.122686 1684 omaha_request_action.cc:272] Request: Jan 15 01:19:07.122980 update_engine[1684]: Jan 15 01:19:07.122980 update_engine[1684]: Jan 15 01:19:07.123376 update_engine[1684]: Jan 15 01:19:07.123376 update_engine[1684]: Jan 15 01:19:07.123376 update_engine[1684]: Jan 15 01:19:07.123376 update_engine[1684]: Jan 15 01:19:07.123376 update_engine[1684]: I20260115 01:19:07.122691 1684 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 01:19:07.123376 update_engine[1684]: I20260115 01:19:07.122711 1684 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 01:19:07.123376 update_engine[1684]: I20260115 01:19:07.122947 1684 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 15 01:19:07.123506 locksmithd[1724]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 15 01:19:07.130925 update_engine[1684]: E20260115 01:19:07.130697 1684 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130790 1684 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130800 1684 omaha_request_action.cc:617] Omaha request response: Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130806 1684 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130810 1684 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130815 1684 update_attempter.cc:306] Processing Done. Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130820 1684 update_attempter.cc:310] Error event sent. Jan 15 01:19:07.130925 update_engine[1684]: I20260115 01:19:07.130836 1684 update_check_scheduler.cc:74] Next update check in 41m43s Jan 15 01:19:07.131226 locksmithd[1724]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 15 01:19:13.442162 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 25. Jan 15 01:19:13.444984 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:13.625742 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:13.634292 (kubelet)[2215]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:19:13.667493 kubelet[2215]: E0115 01:19:13.667454 2215 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:19:13.669246 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:19:13.669366 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:19:13.669813 systemd[1]: kubelet.service: Consumed 140ms CPU time, 110.3M memory peak. Jan 15 01:19:20.686961 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 15 01:19:20.688043 systemd[1]: Started sshd@0-10.0.7.78:22-4.153.228.146:55416.service - OpenSSH per-connection server daemon (4.153.228.146:55416). Jan 15 01:19:21.251726 sshd[2223]: Accepted publickey for core from 4.153.228.146 port 55416 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:21.252985 sshd-session[2223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:21.259089 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 15 01:19:21.260134 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 15 01:19:21.265586 systemd-logind[1682]: New session 1 of user core. Jan 15 01:19:21.280493 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. 
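Every Omaha request above fails with "Could not resolve host: disabled" because the update server hostname is literally the string "disabled"; update_engine reports the error event and reschedules ("Next update check in 41m43s"). On Flatcar this is how automatic updates are switched off in the update configuration. A sketch of where to look, with the SERVER value inferred from the log rather than read from this machine:

    # /etc/flatcar/update.conf drives update_engine's Omaha endpoint and group.
    cat /etc/flatcar/update.conf
    #   GROUP=beta         matches "Current group set to beta" logged above
    #   SERVER=disabled    assumed: the hostname update_engine tried, and failed, to resolve
    update_engine_client -status   # query the updater; expect UPDATE_STATUS_IDLE here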
Jan 15 01:19:21.283149 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 15 01:19:21.305643 (systemd)[2228]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 15 01:19:21.307987 systemd-logind[1682]: New session c1 of user core. Jan 15 01:19:21.439146 systemd[2228]: Queued start job for default target default.target. Jan 15 01:19:21.452780 systemd[2228]: Created slice app.slice - User Application Slice. Jan 15 01:19:21.452815 systemd[2228]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 15 01:19:21.452828 systemd[2228]: Reached target paths.target - Paths. Jan 15 01:19:21.452879 systemd[2228]: Reached target timers.target - Timers. Jan 15 01:19:21.454392 systemd[2228]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 15 01:19:21.459191 systemd[2228]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 15 01:19:21.466677 systemd[2228]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 15 01:19:21.466744 systemd[2228]: Reached target sockets.target - Sockets. Jan 15 01:19:21.473903 systemd[2228]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 15 01:19:21.474043 systemd[2228]: Reached target basic.target - Basic System. Jan 15 01:19:21.474103 systemd[2228]: Reached target default.target - Main User Target. Jan 15 01:19:21.474129 systemd[2228]: Startup finished in 160ms. Jan 15 01:19:21.474367 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 15 01:19:21.478257 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 15 01:19:22.804271 systemd[1]: Started sshd@1-10.0.7.78:22-4.153.228.146:55424.service - OpenSSH per-connection server daemon (4.153.228.146:55424). Jan 15 01:19:23.323855 sshd[2241]: Accepted publickey for core from 4.153.228.146 port 55424 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:23.324969 sshd-session[2241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:23.329144 systemd-logind[1682]: New session 2 of user core. Jan 15 01:19:23.339205 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 15 01:19:23.617700 sshd[2244]: Connection closed by 4.153.228.146 port 55424 Jan 15 01:19:23.619330 sshd-session[2241]: pam_unix(sshd:session): session closed for user core Jan 15 01:19:23.623937 systemd[1]: sshd@1-10.0.7.78:22-4.153.228.146:55424.service: Deactivated successfully. Jan 15 01:19:23.627480 systemd[1]: session-2.scope: Deactivated successfully. Jan 15 01:19:23.629733 systemd-logind[1682]: Session 2 logged out. Waiting for processes to exit. Jan 15 01:19:23.630663 systemd-logind[1682]: Removed session 2. Jan 15 01:19:23.691642 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 26. Jan 15 01:19:23.693202 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:23.725297 systemd[1]: Started sshd@2-10.0.7.78:22-4.153.228.146:55440.service - OpenSSH per-connection server daemon (4.153.228.146:55440). Jan 15 01:19:23.976329 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 01:19:23.979388 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:19:24.033814 kubelet[2261]: E0115 01:19:24.033766 2261 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:19:24.035923 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:19:24.036199 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:19:24.036532 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.4M memory peak. Jan 15 01:19:24.264655 sshd[2253]: Accepted publickey for core from 4.153.228.146 port 55440 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:24.266416 sshd-session[2253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:24.271033 systemd-logind[1682]: New session 3 of user core. Jan 15 01:19:24.280251 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 15 01:19:24.564486 sshd[2267]: Connection closed by 4.153.228.146 port 55440 Jan 15 01:19:24.564877 sshd-session[2253]: pam_unix(sshd:session): session closed for user core Jan 15 01:19:24.569366 systemd-logind[1682]: Session 3 logged out. Waiting for processes to exit. Jan 15 01:19:24.569534 systemd[1]: sshd@2-10.0.7.78:22-4.153.228.146:55440.service: Deactivated successfully. Jan 15 01:19:24.570906 systemd[1]: session-3.scope: Deactivated successfully. Jan 15 01:19:24.572568 systemd-logind[1682]: Removed session 3. Jan 15 01:19:24.674204 systemd[1]: Started sshd@3-10.0.7.78:22-4.153.228.146:54904.service - OpenSSH per-connection server daemon (4.153.228.146:54904). Jan 15 01:19:25.201611 sshd[2273]: Accepted publickey for core from 4.153.228.146 port 54904 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:25.203029 sshd-session[2273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:25.209443 systemd-logind[1682]: New session 4 of user core. Jan 15 01:19:25.217431 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 15 01:19:25.500596 sshd[2276]: Connection closed by 4.153.228.146 port 54904 Jan 15 01:19:25.502252 sshd-session[2273]: pam_unix(sshd:session): session closed for user core Jan 15 01:19:25.505518 systemd[1]: sshd@3-10.0.7.78:22-4.153.228.146:54904.service: Deactivated successfully. Jan 15 01:19:25.507578 systemd[1]: session-4.scope: Deactivated successfully. Jan 15 01:19:25.508318 systemd-logind[1682]: Session 4 logged out. Waiting for processes to exit. Jan 15 01:19:25.509504 systemd-logind[1682]: Removed session 4. Jan 15 01:19:25.606040 systemd[1]: Started sshd@4-10.0.7.78:22-4.153.228.146:54910.service - OpenSSH per-connection server daemon (4.153.228.146:54910). Jan 15 01:19:26.130826 sshd[2282]: Accepted publickey for core from 4.153.228.146 port 54910 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:26.131965 sshd-session[2282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:26.136434 systemd-logind[1682]: New session 5 of user core. Jan 15 01:19:26.145241 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jan 15 01:19:26.359913 sudo[2286]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 15 01:19:26.360237 sudo[2286]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 01:19:26.379173 sudo[2286]: pam_unix(sudo:session): session closed for user root Jan 15 01:19:26.476064 sshd[2285]: Connection closed by 4.153.228.146 port 54910 Jan 15 01:19:26.477236 sshd-session[2282]: pam_unix(sshd:session): session closed for user core Jan 15 01:19:26.483336 systemd[1]: sshd@4-10.0.7.78:22-4.153.228.146:54910.service: Deactivated successfully. Jan 15 01:19:26.486740 systemd[1]: session-5.scope: Deactivated successfully. Jan 15 01:19:26.488193 systemd-logind[1682]: Session 5 logged out. Waiting for processes to exit. Jan 15 01:19:26.490614 systemd-logind[1682]: Removed session 5. Jan 15 01:19:26.585542 systemd[1]: Started sshd@5-10.0.7.78:22-4.153.228.146:54922.service - OpenSSH per-connection server daemon (4.153.228.146:54922). Jan 15 01:19:27.127537 sshd[2292]: Accepted publickey for core from 4.153.228.146 port 54922 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:27.129402 sshd-session[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:27.134595 systemd-logind[1682]: New session 6 of user core. Jan 15 01:19:27.140343 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 15 01:19:27.325963 sudo[2297]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 15 01:19:27.326220 sudo[2297]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 01:19:27.332035 sudo[2297]: pam_unix(sudo:session): session closed for user root Jan 15 01:19:27.338110 sudo[2296]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 15 01:19:27.338339 sudo[2296]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 01:19:27.347828 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 01:19:27.386794 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 15 01:19:27.386918 kernel: audit: type=1305 audit(1768439967.383:233): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 01:19:27.383000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 01:19:27.388116 kernel: audit: type=1300 audit(1768439967.383:233): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcb6b13690 a2=420 a3=0 items=0 ppid=2300 pid=2319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:27.383000 audit[2319]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcb6b13690 a2=420 a3=0 items=0 ppid=2300 pid=2319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:27.390120 augenrules[2319]: No rules Jan 15 01:19:27.383000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 01:19:27.391925 systemd[1]: audit-rules.service: Deactivated successfully. 
Jan 15 01:19:27.392042 kernel: audit: type=1327 audit(1768439967.383:233): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 01:19:27.392165 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 01:19:27.393280 sudo[2296]: pam_unix(sudo:session): session closed for user root Jan 15 01:19:27.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.397097 kernel: audit: type=1130 audit(1768439967.391:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.392000 audit[2296]: USER_END pid=2296 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.401453 kernel: audit: type=1131 audit(1768439967.391:235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.401507 kernel: audit: type=1106 audit(1768439967.392:236): pid=2296 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.401531 kernel: audit: type=1104 audit(1768439967.392:237): pid=2296 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.392000 audit[2296]: CRED_DISP pid=2296 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.492336 sshd[2295]: Connection closed by 4.153.228.146 port 54922 Jan 15 01:19:27.492820 sshd-session[2292]: pam_unix(sshd:session): session closed for user core Jan 15 01:19:27.493000 audit[2292]: USER_END pid=2292 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:27.496517 systemd[1]: sshd@5-10.0.7.78:22-4.153.228.146:54922.service: Deactivated successfully. Jan 15 01:19:27.493000 audit[2292]: CRED_DISP pid=2292 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:27.498816 systemd[1]: session-6.scope: Deactivated successfully. 
Jan 15 01:19:27.499399 kernel: audit: type=1106 audit(1768439967.493:238): pid=2292 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:27.499441 kernel: audit: type=1104 audit(1768439967.493:239): pid=2292 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:27.495000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.7.78:22-4.153.228.146:54922 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.501776 kernel: audit: type=1131 audit(1768439967.495:240): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.7.78:22-4.153.228.146:54922 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:27.501840 systemd-logind[1682]: Session 6 logged out. Waiting for processes to exit. Jan 15 01:19:27.503056 systemd-logind[1682]: Removed session 6. Jan 15 01:19:27.615429 systemd[1]: Started sshd@6-10.0.7.78:22-4.153.228.146:54936.service - OpenSSH per-connection server daemon (4.153.228.146:54936). Jan 15 01:19:27.614000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.7.78:22-4.153.228.146:54936 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:28.149000 audit[2328]: USER_ACCT pid=2328 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:28.150421 sshd[2328]: Accepted publickey for core from 4.153.228.146 port 54936 ssh2: RSA SHA256:W3NJ9iaq4C9Dl4EkQ4ZEj/H9NTPoRStkhopGCs4knf0 Jan 15 01:19:28.151000 audit[2328]: CRED_ACQ pid=2328 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:28.151000 audit[2328]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff4269fbf0 a2=3 a3=0 items=0 ppid=1 pid=2328 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=7 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:28.151000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 01:19:28.152543 sshd-session[2328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:19:28.158653 systemd-logind[1682]: New session 7 of user core. Jan 15 01:19:28.160822 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 15 01:19:28.162000 audit[2328]: USER_START pid=2328 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:28.164000 audit[2331]: CRED_ACQ pid=2331 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:19:28.348000 audit[2332]: USER_ACCT pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:28.348000 audit[2332]: CRED_REFR pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:28.349361 sudo[2332]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 15 01:19:28.349590 sudo[2332]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 01:19:28.350000 audit[2332]: USER_START pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:19:28.792267 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 15 01:19:28.813529 (dockerd)[2351]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 15 01:19:29.149402 dockerd[2351]: time="2026-01-15T01:19:29.149251871Z" level=info msg="Starting up" Jan 15 01:19:29.150104 dockerd[2351]: time="2026-01-15T01:19:29.150050372Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 15 01:19:29.162945 dockerd[2351]: time="2026-01-15T01:19:29.162843795Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 15 01:19:29.235580 dockerd[2351]: time="2026-01-15T01:19:29.235369244Z" level=info msg="Loading containers: start." 
Jan 15 01:19:29.246034 kernel: Initializing XFRM netlink socket Jan 15 01:19:29.321000 audit[2400]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2400 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.321000 audit[2400]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffc11cd93e0 a2=0 a3=0 items=0 ppid=2351 pid=2400 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.321000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 01:19:29.323000 audit[2402]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2402 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.323000 audit[2402]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe32a928e0 a2=0 a3=0 items=0 ppid=2351 pid=2402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.323000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 01:19:29.325000 audit[2404]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2404 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.325000 audit[2404]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1821cb60 a2=0 a3=0 items=0 ppid=2351 pid=2404 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.325000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 01:19:29.328000 audit[2406]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2406 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.328000 audit[2406]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdd500e640 a2=0 a3=0 items=0 ppid=2351 pid=2406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.328000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 01:19:29.330000 audit[2408]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2408 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.330000 audit[2408]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe41eb94d0 a2=0 a3=0 items=0 ppid=2351 pid=2408 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.330000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 01:19:29.332000 audit[2410]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2410 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.332000 audit[2410]: SYSCALL arch=c000003e syscall=46 
success=yes exit=112 a0=3 a1=7ffca3486af0 a2=0 a3=0 items=0 ppid=2351 pid=2410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.332000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 01:19:29.334000 audit[2412]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2412 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.334000 audit[2412]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe95c590b0 a2=0 a3=0 items=0 ppid=2351 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.334000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 01:19:29.336000 audit[2414]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2414 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.336000 audit[2414]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffd123812c0 a2=0 a3=0 items=0 ppid=2351 pid=2414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.336000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 01:19:29.374000 audit[2417]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2417 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.374000 audit[2417]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffe08378c70 a2=0 a3=0 items=0 ppid=2351 pid=2417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.374000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 15 01:19:29.376000 audit[2419]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2419 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.376000 audit[2419]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc47969830 a2=0 a3=0 items=0 ppid=2351 pid=2419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.376000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 01:19:29.379000 audit[2421]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2421 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.379000 audit[2421]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffedfeb6a10 a2=0 
a3=0 items=0 ppid=2351 pid=2421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.379000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 01:19:29.381000 audit[2423]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2423 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.381000 audit[2423]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd5bf360f0 a2=0 a3=0 items=0 ppid=2351 pid=2423 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.381000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 01:19:29.383000 audit[2425]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2425 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.383000 audit[2425]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffd7f716480 a2=0 a3=0 items=0 ppid=2351 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.383000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 01:19:29.424000 audit[2455]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2455 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.424000 audit[2455]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd383c6030 a2=0 a3=0 items=0 ppid=2351 pid=2455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.424000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 01:19:29.426000 audit[2457]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2457 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.426000 audit[2457]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc7a404000 a2=0 a3=0 items=0 ppid=2351 pid=2457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.426000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 01:19:29.428000 audit[2459]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2459 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.428000 audit[2459]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff2b7831a0 a2=0 a3=0 items=0 ppid=2351 pid=2459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 15 01:19:29.428000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 01:19:29.430000 audit[2461]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2461 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.430000 audit[2461]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd3d749260 a2=0 a3=0 items=0 ppid=2351 pid=2461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.430000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 01:19:29.432000 audit[2463]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2463 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.432000 audit[2463]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd765049e0 a2=0 a3=0 items=0 ppid=2351 pid=2463 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.432000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 01:19:29.434000 audit[2465]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2465 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.434000 audit[2465]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe5f53beb0 a2=0 a3=0 items=0 ppid=2351 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.434000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 01:19:29.436000 audit[2467]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.436000 audit[2467]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc98fbba90 a2=0 a3=0 items=0 ppid=2351 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.436000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 01:19:29.438000 audit[2469]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.438000 audit[2469]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffc95bd13e0 a2=0 a3=0 items=0 ppid=2351 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.438000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 01:19:29.441000 audit[2471]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2471 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.441000 audit[2471]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffc0fcb64f0 a2=0 a3=0 items=0 ppid=2351 pid=2471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.441000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 15 01:19:29.443000 audit[2473]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2473 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.443000 audit[2473]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffc56a0110 a2=0 a3=0 items=0 ppid=2351 pid=2473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.443000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 01:19:29.445000 audit[2475]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2475 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.445000 audit[2475]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffdd1966e00 a2=0 a3=0 items=0 ppid=2351 pid=2475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.445000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 01:19:29.446000 audit[2477]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2477 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.446000 audit[2477]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd877e02a0 a2=0 a3=0 items=0 ppid=2351 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.446000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 01:19:29.448000 audit[2479]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2479 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.448000 audit[2479]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffe69961060 a2=0 a3=0 items=0 ppid=2351 pid=2479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.448000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 01:19:29.453000 audit[2484]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2484 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.453000 audit[2484]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd7550dc40 a2=0 a3=0 items=0 ppid=2351 pid=2484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.453000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 01:19:29.456000 audit[2486]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2486 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.456000 audit[2486]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd6d6923f0 a2=0 a3=0 items=0 ppid=2351 pid=2486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.456000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 01:19:29.457000 audit[2488]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2488 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.457000 audit[2488]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fff9396af30 a2=0 a3=0 items=0 ppid=2351 pid=2488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.457000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 01:19:29.459000 audit[2490]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2490 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.459000 audit[2490]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc4a359b10 a2=0 a3=0 items=0 ppid=2351 pid=2490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.459000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 01:19:29.462000 audit[2492]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2492 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.462000 audit[2492]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fff416c23e0 a2=0 a3=0 items=0 ppid=2351 pid=2492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.462000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 01:19:29.463000 audit[2494]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2494 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:29.463000 audit[2494]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe6530d010 a2=0 a3=0 items=0 ppid=2351 pid=2494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.463000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 01:19:29.497000 audit[2499]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2499 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.497000 audit[2499]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffec0a994d0 a2=0 a3=0 items=0 ppid=2351 pid=2499 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.497000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 15 01:19:29.501000 audit[2501]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2501 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.501000 audit[2501]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffc5a64bcc0 a2=0 a3=0 items=0 ppid=2351 pid=2501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.501000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 15 01:19:29.513000 audit[2509]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.513000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffc1842d000 a2=0 a3=0 items=0 ppid=2351 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.513000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 15 01:19:29.526000 audit[2515]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2515 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.526000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff2090f390 a2=0 a3=0 items=0 ppid=2351 pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.526000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 15 01:19:29.528000 audit[2517]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2517 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 
01:19:29.528000 audit[2517]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffe683594a0 a2=0 a3=0 items=0 ppid=2351 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.528000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 15 01:19:29.531000 audit[2519]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2519 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.531000 audit[2519]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe85ccad00 a2=0 a3=0 items=0 ppid=2351 pid=2519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.531000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 15 01:19:29.533000 audit[2521]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2521 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.533000 audit[2521]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffc8b5d53f0 a2=0 a3=0 items=0 ppid=2351 pid=2521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.533000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 01:19:29.536000 audit[2523]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2523 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:29.536000 audit[2523]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe5cf7e890 a2=0 a3=0 items=0 ppid=2351 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:29.536000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 15 01:19:29.538081 systemd-networkd[1599]: docker0: Link UP Jan 15 01:19:29.542768 dockerd[2351]: time="2026-01-15T01:19:29.542709566Z" level=info msg="Loading containers: done." Jan 15 01:19:29.554995 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2790521631-merged.mount: Deactivated successfully. 
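The long run of NETFILTER_CFG/SYSCALL/PROCTITLE audit records above is Docker installing its iptables and ip6tables chains (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER). The proctitle= field in each record is the command line hex-encoded with NUL bytes separating the arguments; a small sketch for decoding it (the sample value is copied from one of the records above):

# Decode an audit PROCTITLE value back into the command line it represents.
def decode_proctitle(hexstr: str) -> str:
    return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode("utf-8", "replace")

sample = "2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552"
print(decode_proctitle(sample))
# -> /usr/bin/iptables --wait -t nat -N DOCKER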
Jan 15 01:19:29.578844 dockerd[2351]: time="2026-01-15T01:19:29.578741025Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 15 01:19:29.579089 dockerd[2351]: time="2026-01-15T01:19:29.578859446Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 15 01:19:29.579089 dockerd[2351]: time="2026-01-15T01:19:29.578959393Z" level=info msg="Initializing buildkit" Jan 15 01:19:29.603501 dockerd[2351]: time="2026-01-15T01:19:29.603426814Z" level=info msg="Completed buildkit initialization" Jan 15 01:19:29.611180 dockerd[2351]: time="2026-01-15T01:19:29.611027563Z" level=info msg="Daemon has completed initialization" Jan 15 01:19:29.611180 dockerd[2351]: time="2026-01-15T01:19:29.611090605Z" level=info msg="API listen on /run/docker.sock" Jan 15 01:19:29.611882 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 15 01:19:29.611000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:30.961775 containerd[1715]: time="2026-01-15T01:19:30.961724093Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 15 01:19:31.633098 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1074233011.mount: Deactivated successfully. Jan 15 01:19:32.625124 containerd[1715]: time="2026-01-15T01:19:32.625006509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:32.626356 containerd[1715]: time="2026-01-15T01:19:32.626314332Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 15 01:19:32.627246 containerd[1715]: time="2026-01-15T01:19:32.627195992Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:32.629238 containerd[1715]: time="2026-01-15T01:19:32.629176405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:32.630217 containerd[1715]: time="2026-01-15T01:19:32.629946173Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.668187037s" Jan 15 01:19:32.630217 containerd[1715]: time="2026-01-15T01:19:32.629984326Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 15 01:19:32.630849 containerd[1715]: time="2026-01-15T01:19:32.630792667Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 15 01:19:33.964043 containerd[1715]: time="2026-01-15T01:19:33.963954795Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:33.965832 containerd[1715]: time="2026-01-15T01:19:33.965787068Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 15 01:19:33.966661 containerd[1715]: time="2026-01-15T01:19:33.966636342Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:33.969694 containerd[1715]: time="2026-01-15T01:19:33.969658417Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:33.970406 containerd[1715]: time="2026-01-15T01:19:33.970364874Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.339432287s" Jan 15 01:19:33.970456 containerd[1715]: time="2026-01-15T01:19:33.970404985Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 15 01:19:33.971030 containerd[1715]: time="2026-01-15T01:19:33.970950856Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 15 01:19:34.191480 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 27. Jan 15 01:19:34.192925 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:34.445592 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:34.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:34.447244 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 15 01:19:34.447314 kernel: audit: type=1130 audit(1768439974.445:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:34.459782 (kubelet)[2632]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:19:34.514789 kubelet[2632]: E0115 01:19:34.514730 2632 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:19:34.516606 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:19:34.516740 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 15 01:19:34.516000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 01:19:34.517382 systemd[1]: kubelet.service: Consumed 155ms CPU time, 110.2M memory peak. Jan 15 01:19:34.521066 kernel: audit: type=1131 audit(1768439974.516:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 01:19:35.585361 containerd[1715]: time="2026-01-15T01:19:35.584610971Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:35.585728 containerd[1715]: time="2026-01-15T01:19:35.585710404Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19398302" Jan 15 01:19:35.586772 containerd[1715]: time="2026-01-15T01:19:35.586744437Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:35.589089 containerd[1715]: time="2026-01-15T01:19:35.589069709Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:35.589810 containerd[1715]: time="2026-01-15T01:19:35.589793961Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.618759262s" Jan 15 01:19:35.589879 containerd[1715]: time="2026-01-15T01:19:35.589869805Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 15 01:19:35.590794 containerd[1715]: time="2026-01-15T01:19:35.590780056Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 15 01:19:36.471555 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1445341944.mount: Deactivated successfully. 
Jan 15 01:19:37.332874 containerd[1715]: time="2026-01-15T01:19:37.332358211Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:37.334478 containerd[1715]: time="2026-01-15T01:19:37.334448009Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=19572392" Jan 15 01:19:37.336129 containerd[1715]: time="2026-01-15T01:19:37.336097876Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:37.339237 containerd[1715]: time="2026-01-15T01:19:37.338169878Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:37.339237 containerd[1715]: time="2026-01-15T01:19:37.338636980Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.74776888s" Jan 15 01:19:37.339237 containerd[1715]: time="2026-01-15T01:19:37.338657215Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 15 01:19:37.339537 containerd[1715]: time="2026-01-15T01:19:37.339517037Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 15 01:19:37.858945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3552491599.mount: Deactivated successfully. 
Jan 15 01:19:38.458984 containerd[1715]: time="2026-01-15T01:19:38.458928456Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:38.460285 containerd[1715]: time="2026-01-15T01:19:38.460035353Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=0" Jan 15 01:19:38.462352 containerd[1715]: time="2026-01-15T01:19:38.462325145Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:38.465280 containerd[1715]: time="2026-01-15T01:19:38.465245185Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:38.466266 containerd[1715]: time="2026-01-15T01:19:38.466242689Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.12669963s" Jan 15 01:19:38.466338 containerd[1715]: time="2026-01-15T01:19:38.466328046Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 15 01:19:38.466842 containerd[1715]: time="2026-01-15T01:19:38.466744939Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 15 01:19:38.995284 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4235845078.mount: Deactivated successfully. 
Jan 15 01:19:39.006054 containerd[1715]: time="2026-01-15T01:19:39.005507746Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 01:19:39.006594 containerd[1715]: time="2026-01-15T01:19:39.006400695Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 15 01:19:39.007856 containerd[1715]: time="2026-01-15T01:19:39.007826413Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 01:19:39.010850 containerd[1715]: time="2026-01-15T01:19:39.010813088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 01:19:39.011456 containerd[1715]: time="2026-01-15T01:19:39.011434372Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 544.49565ms" Jan 15 01:19:39.011534 containerd[1715]: time="2026-01-15T01:19:39.011521971Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 15 01:19:39.012263 containerd[1715]: time="2026-01-15T01:19:39.012236825Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 15 01:19:39.579912 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3532194164.mount: Deactivated successfully. 
Jan 15 01:19:42.743387 containerd[1715]: time="2026-01-15T01:19:42.743317353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:42.744464 containerd[1715]: time="2026-01-15T01:19:42.744286539Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 15 01:19:42.746255 containerd[1715]: time="2026-01-15T01:19:42.746224067Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:42.749412 containerd[1715]: time="2026-01-15T01:19:42.749383908Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:19:42.750585 containerd[1715]: time="2026-01-15T01:19:42.750044731Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.73778246s" Jan 15 01:19:42.750585 containerd[1715]: time="2026-01-15T01:19:42.750076332Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 15 01:19:44.691793 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 28. Jan 15 01:19:44.695159 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:44.830191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:44.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:44.834151 kernel: audit: type=1130 audit(1768439984.829:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:44.839333 (kubelet)[2789]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:19:44.875717 kubelet[2789]: E0115 01:19:44.875678 2789 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:19:44.882372 kernel: audit: type=1131 audit(1768439984.877:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 01:19:44.877000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 15 01:19:44.877996 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:19:44.878132 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:19:44.878457 systemd[1]: kubelet.service: Consumed 133ms CPU time, 110.1M memory peak. Jan 15 01:19:45.247833 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:45.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:45.247000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:45.248922 systemd[1]: kubelet.service: Consumed 133ms CPU time, 110.1M memory peak. Jan 15 01:19:45.253789 kernel: audit: type=1130 audit(1768439985.247:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:45.253837 kernel: audit: type=1131 audit(1768439985.247:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:45.256536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:45.295163 systemd[1]: Reload requested from client PID 2803 ('systemctl') (unit session-7.scope)... Jan 15 01:19:45.295179 systemd[1]: Reloading... Jan 15 01:19:45.395135 zram_generator::config[2852]: No configuration found. Jan 15 01:19:46.059082 systemd[1]: Reloading finished in 763 ms. 
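[annotation] The kubelet starts up to this point (restart counter already at 28) all die immediately because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-provisioned node that file is written during `kubeadm init`/`kubeadm join`, and only the restart that follows the systemd reload below comes up with a real configuration. Purely to illustrate what lives in that file, here is a hedged sketch that marshals a minimal KubeletConfiguration using the upstream Go types; the field values are illustrative assumptions, not this node's actual configuration.

```go
// Sketch: emit a minimal kubelet config of the kind expected at
// /var/lib/kubelet/config.yaml. Field values are illustrative only.
package main

import (
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// Matches the "cgroup driver setting received from the CRI runtime"
		// line later in this log.
		CgroupDriver: "systemd",
		// Where the kubelet looks for the control-plane static pod manifests.
		StaticPodPath: "/etc/kubernetes/manifests",
	}
	out, err := yaml.Marshal(&cfg)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(string(out))
}
```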
Jan 15 01:19:46.079433 kernel: audit: type=1334 audit(1768439986.074:297): prog-id=63 op=LOAD Jan 15 01:19:46.079519 kernel: audit: type=1334 audit(1768439986.074:298): prog-id=52 op=UNLOAD Jan 15 01:19:46.074000 audit: BPF prog-id=63 op=LOAD Jan 15 01:19:46.074000 audit: BPF prog-id=52 op=UNLOAD Jan 15 01:19:46.074000 audit: BPF prog-id=64 op=LOAD Jan 15 01:19:46.083146 kernel: audit: type=1334 audit(1768439986.074:299): prog-id=64 op=LOAD Jan 15 01:19:46.083179 kernel: audit: type=1334 audit(1768439986.074:300): prog-id=65 op=LOAD Jan 15 01:19:46.074000 audit: BPF prog-id=65 op=LOAD Jan 15 01:19:46.074000 audit: BPF prog-id=53 op=UNLOAD Jan 15 01:19:46.084363 kernel: audit: type=1334 audit(1768439986.074:301): prog-id=53 op=UNLOAD Jan 15 01:19:46.074000 audit: BPF prog-id=54 op=UNLOAD Jan 15 01:19:46.085464 kernel: audit: type=1334 audit(1768439986.074:302): prog-id=54 op=UNLOAD Jan 15 01:19:46.078000 audit: BPF prog-id=66 op=LOAD Jan 15 01:19:46.078000 audit: BPF prog-id=45 op=UNLOAD Jan 15 01:19:46.078000 audit: BPF prog-id=67 op=LOAD Jan 15 01:19:46.079000 audit: BPF prog-id=68 op=LOAD Jan 15 01:19:46.079000 audit: BPF prog-id=46 op=UNLOAD Jan 15 01:19:46.079000 audit: BPF prog-id=47 op=UNLOAD Jan 15 01:19:46.079000 audit: BPF prog-id=69 op=LOAD Jan 15 01:19:46.079000 audit: BPF prog-id=48 op=UNLOAD Jan 15 01:19:46.079000 audit: BPF prog-id=70 op=LOAD Jan 15 01:19:46.079000 audit: BPF prog-id=71 op=LOAD Jan 15 01:19:46.079000 audit: BPF prog-id=49 op=UNLOAD Jan 15 01:19:46.079000 audit: BPF prog-id=50 op=UNLOAD Jan 15 01:19:46.082000 audit: BPF prog-id=72 op=LOAD Jan 15 01:19:46.082000 audit: BPF prog-id=55 op=UNLOAD Jan 15 01:19:46.082000 audit: BPF prog-id=73 op=LOAD Jan 15 01:19:46.082000 audit: BPF prog-id=74 op=LOAD Jan 15 01:19:46.082000 audit: BPF prog-id=56 op=UNLOAD Jan 15 01:19:46.082000 audit: BPF prog-id=57 op=UNLOAD Jan 15 01:19:46.082000 audit: BPF prog-id=75 op=LOAD Jan 15 01:19:46.088000 audit: BPF prog-id=76 op=LOAD Jan 15 01:19:46.088000 audit: BPF prog-id=43 op=UNLOAD Jan 15 01:19:46.088000 audit: BPF prog-id=44 op=UNLOAD Jan 15 01:19:46.088000 audit: BPF prog-id=77 op=LOAD Jan 15 01:19:46.088000 audit: BPF prog-id=58 op=UNLOAD Jan 15 01:19:46.089000 audit: BPF prog-id=78 op=LOAD Jan 15 01:19:46.089000 audit: BPF prog-id=60 op=UNLOAD Jan 15 01:19:46.090000 audit: BPF prog-id=79 op=LOAD Jan 15 01:19:46.090000 audit: BPF prog-id=80 op=LOAD Jan 15 01:19:46.090000 audit: BPF prog-id=61 op=UNLOAD Jan 15 01:19:46.090000 audit: BPF prog-id=62 op=UNLOAD Jan 15 01:19:46.090000 audit: BPF prog-id=81 op=LOAD Jan 15 01:19:46.090000 audit: BPF prog-id=59 op=UNLOAD Jan 15 01:19:46.091000 audit: BPF prog-id=82 op=LOAD Jan 15 01:19:46.091000 audit: BPF prog-id=51 op=UNLOAD Jan 15 01:19:46.104466 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 15 01:19:46.104535 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 15 01:19:46.104819 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:46.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 01:19:46.104870 systemd[1]: kubelet.service: Consumed 95ms CPU time, 98.4M memory peak. Jan 15 01:19:46.107237 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:46.988416 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
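[annotation] The burst of BPF prog-id LOAD/UNLOAD records around the daemon reload is systemd re-creating the per-unit eBPF programs it manages (device-access and IP filtering hooks); the kernel echoes each one as an audit record of type 1334. The numeric type= values that appear throughout this log come from include/uapi/linux/audit.h; a small lookup for the ones seen here:

```go
// Tiny helper: name the audit record types that appear in this log
// (values from include/uapi/linux/audit.h).
package main

import "fmt"

var auditTypes = map[int]string{
	1130: "SERVICE_START", // systemd started a unit
	1131: "SERVICE_STOP",  // systemd stopped a unit
	1300: "SYSCALL",       // syscall record (the iptables/runc entries below)
	1325: "NETFILTER_CFG", // netfilter table/chain change
	1327: "PROCTITLE",     // hex-encoded command line of the acting process
	1334: "BPF",           // eBPF program load/unload
}

func main() {
	for _, t := range []int{1130, 1131, 1300, 1325, 1327, 1334} {
		fmt.Printf("type=%d -> AUDIT_%s\n", t, auditTypes[t])
	}
}
```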
Jan 15 01:19:46.987000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:46.998406 (kubelet)[2903]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 01:19:47.037484 kubelet[2903]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 01:19:47.037484 kubelet[2903]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 01:19:47.037484 kubelet[2903]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 01:19:47.039183 kubelet[2903]: I0115 01:19:47.037527 2903 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 01:19:47.332643 kubelet[2903]: I0115 01:19:47.332499 2903 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 01:19:47.332792 kubelet[2903]: I0115 01:19:47.332783 2903 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 01:19:47.333342 kubelet[2903]: I0115 01:19:47.333326 2903 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 01:19:47.377696 kubelet[2903]: E0115 01:19:47.377649 2903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.7.78:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:47.378859 kubelet[2903]: I0115 01:19:47.378831 2903 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 01:19:47.388440 kubelet[2903]: I0115 01:19:47.388409 2903 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 01:19:47.391331 kubelet[2903]: I0115 01:19:47.391314 2903 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 01:19:47.391579 kubelet[2903]: I0115 01:19:47.391550 2903 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 01:19:47.391783 kubelet[2903]: I0115 01:19:47.391584 2903 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-d76f075714","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 01:19:47.391896 kubelet[2903]: I0115 01:19:47.391793 2903 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 01:19:47.391896 kubelet[2903]: I0115 01:19:47.391803 2903 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 01:19:47.391937 kubelet[2903]: I0115 01:19:47.391926 2903 state_mem.go:36] "Initialized new in-memory state store" Jan 15 01:19:47.397482 kubelet[2903]: I0115 01:19:47.397328 2903 kubelet.go:446] "Attempting to sync node with API server" Jan 15 01:19:47.397482 kubelet[2903]: I0115 01:19:47.397368 2903 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 01:19:47.397482 kubelet[2903]: I0115 01:19:47.397396 2903 kubelet.go:352] "Adding apiserver pod source" Jan 15 01:19:47.397482 kubelet[2903]: I0115 01:19:47.397410 2903 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 01:19:47.401981 kubelet[2903]: W0115 01:19:47.401184 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.7.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-d76f075714&limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:47.401981 kubelet[2903]: E0115 01:19:47.401270 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.7.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-d76f075714&limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:47.401981 kubelet[2903]: 
I0115 01:19:47.401384 2903 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 01:19:47.401981 kubelet[2903]: I0115 01:19:47.401856 2903 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 01:19:47.402571 kubelet[2903]: W0115 01:19:47.402556 2903 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 15 01:19:47.405394 kubelet[2903]: I0115 01:19:47.405369 2903 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 01:19:47.405523 kubelet[2903]: I0115 01:19:47.405517 2903 server.go:1287] "Started kubelet" Jan 15 01:19:47.414080 kubelet[2903]: I0115 01:19:47.414025 2903 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 01:19:47.415316 kubelet[2903]: I0115 01:19:47.415300 2903 server.go:479] "Adding debug handlers to kubelet server" Jan 15 01:19:47.417846 kubelet[2903]: I0115 01:19:47.417184 2903 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 01:19:47.418434 kubelet[2903]: I0115 01:19:47.417929 2903 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 01:19:47.418434 kubelet[2903]: I0115 01:19:47.418188 2903 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 01:19:47.420000 audit[2914]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2914 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.420000 audit[2914]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe56a84440 a2=0 a3=0 items=0 ppid=2903 pid=2914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.420000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 01:19:47.421000 audit[2915]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2915 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.421000 audit[2915]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd74db9da0 a2=0 a3=0 items=0 ppid=2903 pid=2915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.421000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 01:19:47.425696 kubelet[2903]: I0115 01:19:47.425420 2903 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 01:19:47.426859 kubelet[2903]: W0115 01:19:47.426778 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.7.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:47.426982 kubelet[2903]: E0115 01:19:47.426968 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://10.0.7.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:47.430903 kubelet[2903]: E0115 01:19:47.429204 2903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.7.78:6443/api/v1/namespaces/default/events\": dial tcp 10.0.7.78:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-n-d76f075714.188ac2cc87e687d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-n-d76f075714,UID:ci-4515-1-0-n-d76f075714,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-d76f075714,},FirstTimestamp:2026-01-15 01:19:47.405490134 +0000 UTC m=+0.403576104,LastTimestamp:2026-01-15 01:19:47.405490134 +0000 UTC m=+0.403576104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-d76f075714,}" Jan 15 01:19:47.431132 kubelet[2903]: I0115 01:19:47.430966 2903 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 01:19:47.431265 kubelet[2903]: E0115 01:19:47.431222 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:47.431872 kubelet[2903]: I0115 01:19:47.431809 2903 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 01:19:47.431912 kubelet[2903]: I0115 01:19:47.431887 2903 reconciler.go:26] "Reconciler: start to sync state" Jan 15 01:19:47.433000 audit[2917]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2917 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.433000 audit[2917]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffed9c32b00 a2=0 a3=0 items=0 ppid=2903 pid=2917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.433000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 01:19:47.435248 kubelet[2903]: W0115 01:19:47.433828 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.7.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:47.435248 kubelet[2903]: E0115 01:19:47.434136 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.7.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:47.435248 kubelet[2903]: E0115 01:19:47.434308 2903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": dial tcp 10.0.7.78:6443: connect: connection refused" interval="200ms" Jan 15 01:19:47.436464 kubelet[2903]: I0115 01:19:47.436437 2903 factory.go:221] Registration of the systemd container factory successfully 
Jan 15 01:19:47.436568 kubelet[2903]: I0115 01:19:47.436541 2903 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 01:19:47.436000 audit[2919]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2919 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.436000 audit[2919]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc7e8c8df0 a2=0 a3=0 items=0 ppid=2903 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.436000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 01:19:47.441040 kubelet[2903]: E0115 01:19:47.440939 2903 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 01:19:47.441040 kubelet[2903]: I0115 01:19:47.441043 2903 factory.go:221] Registration of the containerd container factory successfully Jan 15 01:19:47.450000 audit[2923]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2923 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.450000 audit[2923]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fffed79c540 a2=0 a3=0 items=0 ppid=2903 pid=2923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.450000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 15 01:19:47.452350 kubelet[2903]: I0115 01:19:47.452213 2903 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 15 01:19:47.452000 audit[2924]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2924 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:47.452000 audit[2924]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd0ad82940 a2=0 a3=0 items=0 ppid=2903 pid=2924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.452000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 01:19:47.453577 kubelet[2903]: I0115 01:19:47.453558 2903 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 01:19:47.453645 kubelet[2903]: I0115 01:19:47.453639 2903 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 01:19:47.453694 kubelet[2903]: I0115 01:19:47.453688 2903 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
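[annotation] The NETFILTER_CFG/SYSCALL/PROCTITLE triples in this stretch record the kubelet creating its KUBE-* iptables chains (KUBE-IPTABLES-HINT, KUBE-FIREWALL, and the KUBE-KUBELET-CANARY chains that follow) via xtables-nft-multi; syscall 46 on x86_64 is sendmsg(2), the netlink message that programs nftables. The proctitle field is the command line of the process, hex-encoded with NUL separators between arguments. A short decoder, using one of the strings from the records above:

```go
// Sketch: decode an audit PROCTITLE value into the command line it encodes.
// The example string is copied from one of the records above and decodes to
// "iptables -w 5 -W 100000 -N KUBE-IPTABLES-HINT -t mangle".
package main

import (
	"encoding/hex"
	"fmt"
	"log"
	"strings"
)

func main() {
	const proctitle = "69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		log.Fatal(err)
	}
	// Arguments are separated by NUL bytes; rejoin them with spaces.
	args := strings.Split(string(raw), "\x00")
	fmt.Println(strings.Join(args, " "))
}
```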
Jan 15 01:19:47.453727 kubelet[2903]: I0115 01:19:47.453722 2903 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 01:19:47.453813 kubelet[2903]: E0115 01:19:47.453799 2903 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 01:19:47.454000 audit[2925]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2925 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.454000 audit[2925]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc56605a60 a2=0 a3=0 items=0 ppid=2903 pid=2925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.454000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 01:19:47.455000 audit[2926]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2926 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.455000 audit[2926]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffef1e29930 a2=0 a3=0 items=0 ppid=2903 pid=2926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 01:19:47.456000 audit[2927]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2927 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:47.456000 audit[2927]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd533a5bc0 a2=0 a3=0 items=0 ppid=2903 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.456000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 01:19:47.457000 audit[2928]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2928 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:47.457000 audit[2928]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffef47fc050 a2=0 a3=0 items=0 ppid=2903 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.457000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 01:19:47.458000 audit[2929]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2929 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:47.458000 audit[2929]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffec4b46be0 a2=0 a3=0 items=0 ppid=2903 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 
01:19:47.458000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 01:19:47.459000 audit[2930]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2930 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:47.459000 audit[2930]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffee43786c0 a2=0 a3=0 items=0 ppid=2903 pid=2930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:47.459000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 01:19:47.462614 kubelet[2903]: W0115 01:19:47.462530 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.7.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:47.462738 kubelet[2903]: E0115 01:19:47.462711 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.7.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:47.471735 kubelet[2903]: I0115 01:19:47.471505 2903 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 01:19:47.471735 kubelet[2903]: I0115 01:19:47.471520 2903 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 01:19:47.471735 kubelet[2903]: I0115 01:19:47.471542 2903 state_mem.go:36] "Initialized new in-memory state store" Jan 15 01:19:47.473889 kubelet[2903]: I0115 01:19:47.473676 2903 policy_none.go:49] "None policy: Start" Jan 15 01:19:47.473889 kubelet[2903]: I0115 01:19:47.473697 2903 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 01:19:47.473889 kubelet[2903]: I0115 01:19:47.473709 2903 state_mem.go:35] "Initializing new in-memory state store" Jan 15 01:19:47.479365 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 15 01:19:47.488901 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 15 01:19:47.491751 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 15 01:19:47.507999 kubelet[2903]: I0115 01:19:47.507972 2903 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 01:19:47.508206 kubelet[2903]: I0115 01:19:47.508185 2903 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 01:19:47.508232 kubelet[2903]: I0115 01:19:47.508196 2903 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 01:19:47.509029 kubelet[2903]: I0115 01:19:47.508974 2903 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 01:19:47.509406 kubelet[2903]: E0115 01:19:47.509382 2903 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 15 01:19:47.509608 kubelet[2903]: E0115 01:19:47.509596 2903 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:47.562764 systemd[1]: Created slice kubepods-burstable-pod95a27261a5eb27833e5aae5be403ab0c.slice - libcontainer container kubepods-burstable-pod95a27261a5eb27833e5aae5be403ab0c.slice. Jan 15 01:19:47.581077 kubelet[2903]: E0115 01:19:47.578957 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.585108 systemd[1]: Created slice kubepods-burstable-podc12e4e6a98d84f27efaba5c27ae424e3.slice - libcontainer container kubepods-burstable-podc12e4e6a98d84f27efaba5c27ae424e3.slice. Jan 15 01:19:47.590811 kubelet[2903]: E0115 01:19:47.590758 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.591253 systemd[1]: Created slice kubepods-burstable-pod73d69104894a5a57f7b5508475370384.slice - libcontainer container kubepods-burstable-pod73d69104894a5a57f7b5508475370384.slice. Jan 15 01:19:47.594154 kubelet[2903]: E0115 01:19:47.594101 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.610739 kubelet[2903]: I0115 01:19:47.610377 2903 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.610739 kubelet[2903]: E0115 01:19:47.610699 2903 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.78:6443/api/v1/nodes\": dial tcp 10.0.7.78:6443: connect: connection refused" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632460 kubelet[2903]: I0115 01:19:47.632404 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632460 kubelet[2903]: I0115 01:19:47.632446 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632460 kubelet[2903]: I0115 01:19:47.632464 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73d69104894a5a57f7b5508475370384-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-d76f075714\" (UID: \"73d69104894a5a57f7b5508475370384\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632648 kubelet[2903]: I0115 01:19:47.632480 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-ca-certs\") 
pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632648 kubelet[2903]: I0115 01:19:47.632497 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632648 kubelet[2903]: I0115 01:19:47.632511 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632648 kubelet[2903]: I0115 01:19:47.632525 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632648 kubelet[2903]: I0115 01:19:47.632544 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.632760 kubelet[2903]: I0115 01:19:47.632567 2903 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.634845 kubelet[2903]: E0115 01:19:47.634802 2903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": dial tcp 10.0.7.78:6443: connect: connection refused" interval="400ms" Jan 15 01:19:47.813106 kubelet[2903]: I0115 01:19:47.813001 2903 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.813525 kubelet[2903]: E0115 01:19:47.813503 2903 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.78:6443/api/v1/nodes\": dial tcp 10.0.7.78:6443: connect: connection refused" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:47.881078 containerd[1715]: time="2026-01-15T01:19:47.880985767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-d76f075714,Uid:95a27261a5eb27833e5aae5be403ab0c,Namespace:kube-system,Attempt:0,}" Jan 15 01:19:47.894709 containerd[1715]: time="2026-01-15T01:19:47.894598923Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-d76f075714,Uid:c12e4e6a98d84f27efaba5c27ae424e3,Namespace:kube-system,Attempt:0,}" Jan 15 01:19:47.894974 containerd[1715]: time="2026-01-15T01:19:47.894935818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-d76f075714,Uid:73d69104894a5a57f7b5508475370384,Namespace:kube-system,Attempt:0,}" Jan 15 01:19:48.035837 kubelet[2903]: E0115 01:19:48.035782 2903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": dial tcp 10.0.7.78:6443: connect: connection refused" interval="800ms" Jan 15 01:19:48.215190 kubelet[2903]: I0115 01:19:48.215120 2903 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:48.215856 kubelet[2903]: E0115 01:19:48.215831 2903 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.78:6443/api/v1/nodes\": dial tcp 10.0.7.78:6443: connect: connection refused" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:48.310126 kubelet[2903]: W0115 01:19:48.310058 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.7.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:48.310126 kubelet[2903]: E0115 01:19:48.310103 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.7.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:48.479317 kubelet[2903]: W0115 01:19:48.479031 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.7.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:48.479317 kubelet[2903]: E0115 01:19:48.479120 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.7.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:48.550639 kubelet[2903]: E0115 01:19:48.550530 2903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.7.78:6443/api/v1/namespaces/default/events\": dial tcp 10.0.7.78:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-n-d76f075714.188ac2cc87e687d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-n-d76f075714,UID:ci-4515-1-0-n-d76f075714,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-d76f075714,},FirstTimestamp:2026-01-15 01:19:47.405490134 +0000 UTC m=+0.403576104,LastTimestamp:2026-01-15 01:19:47.405490134 +0000 UTC m=+0.403576104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-d76f075714,}" Jan 15 01:19:48.764291 containerd[1715]: 
time="2026-01-15T01:19:48.764165029Z" level=info msg="connecting to shim d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754" address="unix:///run/containerd/s/21570c8bc77ea03514cf7c68200556c44faf87937f199c1305516806677c4467" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:19:48.785041 containerd[1715]: time="2026-01-15T01:19:48.784927925Z" level=info msg="connecting to shim 791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502" address="unix:///run/containerd/s/0b0c092080c89c2f3d2d000bf1ff3ae60a137856226a0957d93f90b870a6935b" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:19:48.790901 containerd[1715]: time="2026-01-15T01:19:48.789820755Z" level=info msg="connecting to shim 279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d" address="unix:///run/containerd/s/46514ef6c2b9f049ad34480bfe0774b14eccfd40b36a55a935f7db475f6febe3" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:19:48.795459 systemd[1]: Started cri-containerd-d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754.scope - libcontainer container d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754. Jan 15 01:19:48.813000 audit: BPF prog-id=83 op=LOAD Jan 15 01:19:48.814000 audit: BPF prog-id=84 op=LOAD Jan 15 01:19:48.814000 audit[2956]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.814000 audit: BPF prog-id=84 op=UNLOAD Jan 15 01:19:48.814000 audit[2956]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.814000 audit: BPF prog-id=85 op=LOAD Jan 15 01:19:48.814000 audit[2956]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.815000 audit: BPF prog-id=86 op=LOAD Jan 15 01:19:48.815000 audit[2956]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.815000 audit: BPF prog-id=86 op=UNLOAD Jan 15 01:19:48.815000 audit[2956]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.815000 audit: BPF prog-id=85 op=UNLOAD Jan 15 01:19:48.815000 audit[2956]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.815000 audit: BPF prog-id=87 op=LOAD Jan 15 01:19:48.815000 audit[2956]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2943 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437306662336339316137363730343832306337303264643239616539 Jan 15 01:19:48.835230 systemd[1]: Started cri-containerd-279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d.scope - libcontainer container 279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d. Jan 15 01:19:48.837473 kubelet[2903]: E0115 01:19:48.836449 2903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": dial tcp 10.0.7.78:6443: connect: connection refused" interval="1.6s" Jan 15 01:19:48.840410 systemd[1]: Started cri-containerd-791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502.scope - libcontainer container 791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502. 
Jan 15 01:19:48.863000 audit: BPF prog-id=88 op=LOAD Jan 15 01:19:48.864000 audit: BPF prog-id=89 op=LOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=89 op=UNLOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=90 op=LOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=91 op=LOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=91 op=UNLOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=90 op=UNLOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.864000 audit: BPF prog-id=92 op=LOAD Jan 15 01:19:48.864000 audit[3012]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2986 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237396138343563336538366330616337343339636635663339643263 Jan 15 01:19:48.869000 audit: BPF prog-id=93 op=LOAD Jan 15 01:19:48.870000 audit: BPF prog-id=94 op=LOAD Jan 15 01:19:48.870000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001f8238 a2=98 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.870000 audit: BPF prog-id=94 op=UNLOAD Jan 15 01:19:48.870000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.870000 audit: BPF prog-id=95 op=LOAD Jan 15 01:19:48.870000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001f8488 a2=98 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.870000 audit: BPF prog-id=96 op=LOAD Jan 15 01:19:48.870000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001f8218 a2=98 a3=0 items=0 ppid=2974 pid=3016 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.871000 audit: BPF prog-id=96 op=UNLOAD Jan 15 01:19:48.871000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.871000 audit: BPF prog-id=95 op=UNLOAD Jan 15 01:19:48.871000 audit[3016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.871000 audit: BPF prog-id=97 op=LOAD Jan 15 01:19:48.871000 audit[3016]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001f86e8 a2=98 a3=0 items=0 ppid=2974 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739313333316563633366626366643361343139323831393237613130 Jan 15 01:19:48.875144 containerd[1715]: time="2026-01-15T01:19:48.875108476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-d76f075714,Uid:95a27261a5eb27833e5aae5be403ab0c,Namespace:kube-system,Attempt:0,} returns sandbox id \"d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754\"" Jan 15 01:19:48.880061 containerd[1715]: time="2026-01-15T01:19:48.880032021Z" level=info msg="CreateContainer within sandbox \"d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 15 01:19:48.891438 containerd[1715]: time="2026-01-15T01:19:48.891402855Z" level=info msg="Container 3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:19:48.903334 containerd[1715]: time="2026-01-15T01:19:48.903162166Z" level=info msg="CreateContainer within sandbox \"d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b\"" Jan 15 01:19:48.904282 containerd[1715]: time="2026-01-15T01:19:48.904202913Z" level=info msg="StartContainer for \"3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b\"" Jan 15 01:19:48.906079 containerd[1715]: time="2026-01-15T01:19:48.905956812Z" level=info msg="connecting to shim 3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b" address="unix:///run/containerd/s/21570c8bc77ea03514cf7c68200556c44faf87937f199c1305516806677c4467" protocol=ttrpc version=3 Jan 15 01:19:48.914650 kubelet[2903]: W0115 01:19:48.914473 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.7.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-d76f075714&limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:48.914846 kubelet[2903]: E0115 01:19:48.914730 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.7.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-d76f075714&limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:48.933526 containerd[1715]: time="2026-01-15T01:19:48.933461518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-d76f075714,Uid:73d69104894a5a57f7b5508475370384,Namespace:kube-system,Attempt:0,} returns sandbox id \"791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502\"" Jan 15 01:19:48.934689 containerd[1715]: time="2026-01-15T01:19:48.934285995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-d76f075714,Uid:c12e4e6a98d84f27efaba5c27ae424e3,Namespace:kube-system,Attempt:0,} returns sandbox id \"279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d\"" Jan 15 01:19:48.937195 containerd[1715]: time="2026-01-15T01:19:48.937127236Z" level=info msg="CreateContainer within sandbox \"279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 15 01:19:48.938171 containerd[1715]: time="2026-01-15T01:19:48.938148380Z" level=info msg="CreateContainer within sandbox \"791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 15 01:19:48.942369 systemd[1]: Started cri-containerd-3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b.scope - libcontainer container 3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b. 
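The audit PROCTITLE values in the records above are the runc command line hex-encoded, with NUL bytes separating the arguments. A minimal decoding sketch, assuming a standalone helper run against a copied value (not part of any tool shown in this log; the value is truncated here for brevity):

    hex_proctitle = ("72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
                     "002D2D6C6F67")
    # audit stores argv hex-encoded with NUL separators between arguments
    argv = [arg.decode() for arg in bytes.fromhex(hex_proctitle).split(b"\x00")]
    print(argv)  # ['runc', '--root', '/run/containerd/runc/k8s.io', '--log']

Applied to the full values above, this recovers the runc invocation pointing at the per-task directory under /run/containerd/io.containerd.runtime.v2.task/k8s.io/.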
Jan 15 01:19:48.945749 containerd[1715]: time="2026-01-15T01:19:48.945717579Z" level=info msg="Container 9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:19:48.954000 audit: BPF prog-id=98 op=LOAD Jan 15 01:19:48.955000 audit: BPF prog-id=99 op=LOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=99 op=UNLOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=100 op=LOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=101 op=LOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=101 op=UNLOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=100 op=UNLOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.955000 audit: BPF prog-id=102 op=LOAD Jan 15 01:19:48.955000 audit[3062]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2943 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:48.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364633363643536393434366331623432636463613837663939333230 Jan 15 01:19:48.969686 containerd[1715]: time="2026-01-15T01:19:48.969644001Z" level=info msg="CreateContainer within sandbox \"279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f\"" Jan 15 01:19:48.970411 containerd[1715]: time="2026-01-15T01:19:48.970389762Z" level=info msg="StartContainer for \"9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f\"" Jan 15 01:19:48.971547 containerd[1715]: time="2026-01-15T01:19:48.971514818Z" level=info msg="connecting to shim 9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f" address="unix:///run/containerd/s/46514ef6c2b9f049ad34480bfe0774b14eccfd40b36a55a935f7db475f6febe3" protocol=ttrpc version=3 Jan 15 01:19:48.973028 containerd[1715]: time="2026-01-15T01:19:48.972994959Z" level=info msg="Container e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:19:48.982164 containerd[1715]: time="2026-01-15T01:19:48.982070922Z" level=info msg="CreateContainer within sandbox \"791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116\"" Jan 15 01:19:48.982907 containerd[1715]: time="2026-01-15T01:19:48.982840238Z" level=info msg="StartContainer for \"e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116\"" Jan 15 01:19:48.983782 containerd[1715]: time="2026-01-15T01:19:48.983727106Z" level=info msg="connecting to shim e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116" address="unix:///run/containerd/s/0b0c092080c89c2f3d2d000bf1ff3ae60a137856226a0957d93f90b870a6935b" protocol=ttrpc version=3 Jan 15 01:19:48.987554 kubelet[2903]: W0115 
01:19:48.986637 2903 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.7.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.7.78:6443: connect: connection refused Jan 15 01:19:48.987554 kubelet[2903]: E0115 01:19:48.986720 2903 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.7.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.7.78:6443: connect: connection refused" logger="UnhandledError" Jan 15 01:19:48.999293 systemd[1]: Started cri-containerd-9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f.scope - libcontainer container 9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f. Jan 15 01:19:49.012611 systemd[1]: Started cri-containerd-e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116.scope - libcontainer container e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116. Jan 15 01:19:49.015658 containerd[1715]: time="2026-01-15T01:19:49.013955661Z" level=info msg="StartContainer for \"3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b\" returns successfully" Jan 15 01:19:49.018861 kubelet[2903]: I0115 01:19:49.018834 2903 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:49.019833 kubelet[2903]: E0115 01:19:49.019806 2903 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.78:6443/api/v1/nodes\": dial tcp 10.0.7.78:6443: connect: connection refused" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:49.026000 audit: BPF prog-id=103 op=LOAD Jan 15 01:19:49.027000 audit: BPF prog-id=104 op=LOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=104 op=UNLOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=105 op=LOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 
01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=106 op=LOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=106 op=UNLOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=105 op=UNLOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.027000 audit: BPF prog-id=107 op=LOAD Jan 15 01:19:49.027000 audit[3096]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2986 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.027000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930393364393262633232613030326230613034663631653639333336 Jan 15 01:19:49.051000 audit: BPF prog-id=108 op=LOAD Jan 15 01:19:49.053000 audit: BPF prog-id=109 op=LOAD Jan 15 01:19:49.053000 audit[3105]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.053000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.053000 audit: BPF prog-id=109 op=UNLOAD Jan 15 01:19:49.053000 audit[3105]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.053000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.053000 audit: BPF prog-id=110 op=LOAD Jan 15 01:19:49.053000 audit[3105]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.053000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.054000 audit: BPF prog-id=111 op=LOAD Jan 15 01:19:49.054000 audit[3105]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.054000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.054000 audit: BPF prog-id=111 op=UNLOAD Jan 15 01:19:49.054000 audit[3105]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.054000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.054000 audit: BPF prog-id=110 op=UNLOAD Jan 15 01:19:49.054000 audit[3105]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.054000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.054000 audit: BPF prog-id=112 op=LOAD Jan 15 01:19:49.054000 audit[3105]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=2974 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:49.054000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613961363966323831633439333963623434346564316266633835 Jan 15 01:19:49.089403 containerd[1715]: time="2026-01-15T01:19:49.089352930Z" level=info msg="StartContainer for \"9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f\" returns successfully" Jan 15 01:19:49.129192 containerd[1715]: time="2026-01-15T01:19:49.129152529Z" level=info msg="StartContainer for \"e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116\" returns successfully" Jan 15 01:19:49.471866 kubelet[2903]: E0115 01:19:49.471835 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:49.478447 kubelet[2903]: E0115 01:19:49.478421 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:49.480503 kubelet[2903]: E0115 01:19:49.480482 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.483337 kubelet[2903]: E0115 01:19:50.483313 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.485179 kubelet[2903]: E0115 01:19:50.483964 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.621625 kubelet[2903]: I0115 01:19:50.621599 2903 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.690466 kubelet[2903]: E0115 01:19:50.690430 2903 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.891589 kubelet[2903]: I0115 01:19:50.891495 2903 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:50.891589 kubelet[2903]: E0115 01:19:50.891534 2903 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4515-1-0-n-d76f075714\": node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:50.912403 kubelet[2903]: E0115 01:19:50.912373 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 
15 01:19:51.012986 kubelet[2903]: E0115 01:19:51.012946 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.114279 kubelet[2903]: E0115 01:19:51.114233 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.214390 kubelet[2903]: E0115 01:19:51.214342 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.315174 kubelet[2903]: E0115 01:19:51.315122 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.415660 kubelet[2903]: E0115 01:19:51.415616 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.484374 kubelet[2903]: E0115 01:19:51.483896 2903 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-d76f075714\" not found" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:51.515971 kubelet[2903]: E0115 01:19:51.515936 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.616815 kubelet[2903]: E0115 01:19:51.616776 2903 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:51.732752 kubelet[2903]: I0115 01:19:51.731719 2903 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:51.742716 kubelet[2903]: I0115 01:19:51.742607 2903 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:51.749815 kubelet[2903]: I0115 01:19:51.749647 2903 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:52.419398 kubelet[2903]: I0115 01:19:52.419349 2903 apiserver.go:52] "Watching apiserver" Jan 15 01:19:52.433050 kubelet[2903]: I0115 01:19:52.432994 2903 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 01:19:53.003823 systemd[1]: Reload requested from client PID 3172 ('systemctl') (unit session-7.scope)... Jan 15 01:19:53.003841 systemd[1]: Reloading... Jan 15 01:19:53.093089 zram_generator::config[3218]: No configuration found. Jan 15 01:19:53.330489 systemd[1]: Reloading finished in 325 ms. Jan 15 01:19:53.356205 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:53.367422 systemd[1]: kubelet.service: Deactivated successfully. Jan 15 01:19:53.367798 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:53.371562 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 15 01:19:53.371648 kernel: audit: type=1131 audit(1768439993.366:399): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:19:53.366000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:19:53.369926 systemd[1]: kubelet.service: Consumed 724ms CPU time, 131.1M memory peak. Jan 15 01:19:53.373788 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:19:53.373000 audit: BPF prog-id=113 op=LOAD Jan 15 01:19:53.377085 kernel: audit: type=1334 audit(1768439993.373:400): prog-id=113 op=LOAD Jan 15 01:19:53.373000 audit: BPF prog-id=114 op=LOAD Jan 15 01:19:53.380717 kernel: audit: type=1334 audit(1768439993.373:401): prog-id=114 op=LOAD Jan 15 01:19:53.380765 kernel: audit: type=1334 audit(1768439993.373:402): prog-id=75 op=UNLOAD Jan 15 01:19:53.373000 audit: BPF prog-id=75 op=UNLOAD Jan 15 01:19:53.383956 kernel: audit: type=1334 audit(1768439993.373:403): prog-id=76 op=UNLOAD Jan 15 01:19:53.384030 kernel: audit: type=1334 audit(1768439993.374:404): prog-id=115 op=LOAD Jan 15 01:19:53.373000 audit: BPF prog-id=76 op=UNLOAD Jan 15 01:19:53.374000 audit: BPF prog-id=115 op=LOAD Jan 15 01:19:53.374000 audit: BPF prog-id=82 op=UNLOAD Jan 15 01:19:53.386447 kernel: audit: type=1334 audit(1768439993.374:405): prog-id=82 op=UNLOAD Jan 15 01:19:53.386500 kernel: audit: type=1334 audit(1768439993.375:406): prog-id=116 op=LOAD Jan 15 01:19:53.375000 audit: BPF prog-id=116 op=LOAD Jan 15 01:19:53.387517 kernel: audit: type=1334 audit(1768439993.375:407): prog-id=81 op=UNLOAD Jan 15 01:19:53.375000 audit: BPF prog-id=81 op=UNLOAD Jan 15 01:19:53.375000 audit: BPF prog-id=117 op=LOAD Jan 15 01:19:53.390046 kernel: audit: type=1334 audit(1768439993.375:408): prog-id=117 op=LOAD Jan 15 01:19:53.375000 audit: BPF prog-id=69 op=UNLOAD Jan 15 01:19:53.375000 audit: BPF prog-id=118 op=LOAD Jan 15 01:19:53.375000 audit: BPF prog-id=119 op=LOAD Jan 15 01:19:53.375000 audit: BPF prog-id=70 op=UNLOAD Jan 15 01:19:53.375000 audit: BPF prog-id=71 op=UNLOAD Jan 15 01:19:53.376000 audit: BPF prog-id=120 op=LOAD Jan 15 01:19:53.376000 audit: BPF prog-id=66 op=UNLOAD Jan 15 01:19:53.376000 audit: BPF prog-id=121 op=LOAD Jan 15 01:19:53.376000 audit: BPF prog-id=122 op=LOAD Jan 15 01:19:53.376000 audit: BPF prog-id=67 op=UNLOAD Jan 15 01:19:53.376000 audit: BPF prog-id=68 op=UNLOAD Jan 15 01:19:53.378000 audit: BPF prog-id=123 op=LOAD Jan 15 01:19:53.378000 audit: BPF prog-id=78 op=UNLOAD Jan 15 01:19:53.378000 audit: BPF prog-id=124 op=LOAD Jan 15 01:19:53.378000 audit: BPF prog-id=125 op=LOAD Jan 15 01:19:53.378000 audit: BPF prog-id=79 op=UNLOAD Jan 15 01:19:53.378000 audit: BPF prog-id=80 op=UNLOAD Jan 15 01:19:53.378000 audit: BPF prog-id=126 op=LOAD Jan 15 01:19:53.378000 audit: BPF prog-id=72 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=127 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=128 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=73 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=74 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=129 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=77 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=130 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=63 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=131 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=132 op=LOAD Jan 15 01:19:53.380000 audit: BPF prog-id=64 op=UNLOAD Jan 15 01:19:53.380000 audit: BPF prog-id=65 op=UNLOAD Jan 15 01:19:53.563192 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:19:53.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 15 01:19:53.573358 (kubelet)[3275]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 01:19:53.621100 kubelet[3275]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 01:19:53.621100 kubelet[3275]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 01:19:53.621100 kubelet[3275]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 01:19:53.621100 kubelet[3275]: I0115 01:19:53.620228 3275 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 01:19:53.631793 kubelet[3275]: I0115 01:19:53.631316 3275 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 01:19:53.631793 kubelet[3275]: I0115 01:19:53.631339 3275 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 01:19:53.631793 kubelet[3275]: I0115 01:19:53.631587 3275 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 01:19:53.633950 kubelet[3275]: I0115 01:19:53.633899 3275 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 15 01:19:53.637507 kubelet[3275]: I0115 01:19:53.637049 3275 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 01:19:53.648077 kubelet[3275]: I0115 01:19:53.648003 3275 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 01:19:53.651036 kubelet[3275]: I0115 01:19:53.650809 3275 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 01:19:53.651163 kubelet[3275]: I0115 01:19:53.651093 3275 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 01:19:53.651305 kubelet[3275]: I0115 01:19:53.651118 3275 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-d76f075714","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 01:19:53.651379 kubelet[3275]: I0115 01:19:53.651315 3275 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 01:19:53.651379 kubelet[3275]: I0115 01:19:53.651325 3275 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 01:19:53.651430 kubelet[3275]: I0115 01:19:53.651372 3275 state_mem.go:36] "Initialized new in-memory state store" Jan 15 01:19:53.651587 kubelet[3275]: I0115 01:19:53.651575 3275 kubelet.go:446] "Attempting to sync node with API server" Jan 15 01:19:53.651616 kubelet[3275]: I0115 01:19:53.651598 3275 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 01:19:53.651643 kubelet[3275]: I0115 01:19:53.651634 3275 kubelet.go:352] "Adding apiserver pod source" Jan 15 01:19:53.651665 kubelet[3275]: I0115 01:19:53.651645 3275 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 01:19:53.656654 kubelet[3275]: I0115 01:19:53.656632 3275 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 01:19:53.658029 kubelet[3275]: I0115 01:19:53.657125 3275 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 01:19:53.658029 kubelet[3275]: I0115 01:19:53.657482 3275 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 01:19:53.658029 kubelet[3275]: I0115 01:19:53.657506 3275 server.go:1287] "Started kubelet" Jan 15 01:19:53.659628 kubelet[3275]: I0115 01:19:53.659613 3275 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 01:19:53.664854 kubelet[3275]: I0115 01:19:53.664819 3275 server.go:169] 
"Starting to listen" address="0.0.0.0" port=10250 Jan 15 01:19:53.665813 kubelet[3275]: I0115 01:19:53.665797 3275 server.go:479] "Adding debug handlers to kubelet server" Jan 15 01:19:53.670203 kubelet[3275]: I0115 01:19:53.670142 3275 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 01:19:53.670411 kubelet[3275]: I0115 01:19:53.670398 3275 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 01:19:53.671795 kubelet[3275]: E0115 01:19:53.671777 3275 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-d76f075714\" not found" Jan 15 01:19:53.672411 kubelet[3275]: I0115 01:19:53.672400 3275 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 01:19:53.677174 kubelet[3275]: I0115 01:19:53.677159 3275 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 01:19:53.677581 kubelet[3275]: I0115 01:19:53.677570 3275 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 01:19:53.677730 kubelet[3275]: I0115 01:19:53.677724 3275 reconciler.go:26] "Reconciler: start to sync state" Jan 15 01:19:53.681034 kubelet[3275]: I0115 01:19:53.680773 3275 factory.go:221] Registration of the systemd container factory successfully Jan 15 01:19:53.681034 kubelet[3275]: I0115 01:19:53.680880 3275 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 01:19:53.683840 kubelet[3275]: E0115 01:19:53.683823 3275 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 01:19:53.685965 kubelet[3275]: I0115 01:19:53.685950 3275 factory.go:221] Registration of the containerd container factory successfully Jan 15 01:19:53.688558 kubelet[3275]: I0115 01:19:53.688538 3275 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 15 01:19:53.689866 kubelet[3275]: I0115 01:19:53.689852 3275 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 01:19:53.689942 kubelet[3275]: I0115 01:19:53.689936 3275 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 01:19:53.690038 kubelet[3275]: I0115 01:19:53.689991 3275 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 15 01:19:53.690038 kubelet[3275]: I0115 01:19:53.690000 3275 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 01:19:53.690122 kubelet[3275]: E0115 01:19:53.690110 3275 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743409 3275 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743421 3275 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743441 3275 state_mem.go:36] "Initialized new in-memory state store" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743601 3275 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743612 3275 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743631 3275 policy_none.go:49] "None policy: Start" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743640 3275 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743649 3275 state_mem.go:35] "Initializing new in-memory state store" Jan 15 01:19:53.744135 kubelet[3275]: I0115 01:19:53.743768 3275 state_mem.go:75] "Updated machine memory state" Jan 15 01:19:53.750028 kubelet[3275]: I0115 01:19:53.749988 3275 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 01:19:53.750351 kubelet[3275]: I0115 01:19:53.750338 3275 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 01:19:53.750446 kubelet[3275]: I0115 01:19:53.750414 3275 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 01:19:53.751031 kubelet[3275]: I0115 01:19:53.750758 3275 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 01:19:53.753614 kubelet[3275]: E0115 01:19:53.753597 3275 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 15 01:19:53.792034 kubelet[3275]: I0115 01:19:53.791293 3275 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.792942 kubelet[3275]: I0115 01:19:53.792570 3275 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.793343 kubelet[3275]: I0115 01:19:53.792777 3275 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.804437 kubelet[3275]: E0115 01:19:53.804392 3275 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" already exists" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.805407 kubelet[3275]: E0115 01:19:53.805381 3275 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-d76f075714\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.805510 kubelet[3275]: E0115 01:19:53.805493 3275 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-d76f075714\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.856953 kubelet[3275]: I0115 01:19:53.856145 3275 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.868591 kubelet[3275]: I0115 01:19:53.868550 3275 kubelet_node_status.go:124] "Node was previously registered" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.868747 kubelet[3275]: I0115 01:19:53.868652 3275 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.879972 kubelet[3275]: I0115 01:19:53.878990 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.879972 kubelet[3275]: I0115 01:19:53.879217 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.879972 kubelet[3275]: I0115 01:19:53.879248 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.879972 kubelet[3275]: I0115 01:19:53.879272 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/95a27261a5eb27833e5aae5be403ab0c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-d76f075714\" (UID: \"95a27261a5eb27833e5aae5be403ab0c\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.879972 kubelet[3275]: I0115 01:19:53.879291 
3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.880555 kubelet[3275]: I0115 01:19:53.879319 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.880555 kubelet[3275]: I0115 01:19:53.879335 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.880555 kubelet[3275]: I0115 01:19:53.879353 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c12e4e6a98d84f27efaba5c27ae424e3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-d76f075714\" (UID: \"c12e4e6a98d84f27efaba5c27ae424e3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" Jan 15 01:19:53.880555 kubelet[3275]: I0115 01:19:53.879371 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73d69104894a5a57f7b5508475370384-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-d76f075714\" (UID: \"73d69104894a5a57f7b5508475370384\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:54.652858 kubelet[3275]: I0115 01:19:54.652108 3275 apiserver.go:52] "Watching apiserver" Jan 15 01:19:54.678158 kubelet[3275]: I0115 01:19:54.678099 3275 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 01:19:54.718953 kubelet[3275]: I0115 01:19:54.718916 3275 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:54.719778 kubelet[3275]: I0115 01:19:54.719739 3275 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:54.730629 kubelet[3275]: E0115 01:19:54.730361 3275 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-d76f075714\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" Jan 15 01:19:54.732168 kubelet[3275]: E0115 01:19:54.732151 3275 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-d76f075714\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" Jan 15 01:19:54.763432 kubelet[3275]: I0115 01:19:54.763286 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4515-1-0-n-d76f075714" podStartSLOduration=3.763268923 podStartE2EDuration="3.763268923s" podCreationTimestamp="2026-01-15 01:19:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:19:54.741717338 +0000 UTC m=+1.165157980" watchObservedRunningTime="2026-01-15 01:19:54.763268923 +0000 UTC m=+1.186709565" Jan 15 01:19:54.779794 kubelet[3275]: I0115 01:19:54.778125 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4515-1-0-n-d76f075714" podStartSLOduration=3.778107827 podStartE2EDuration="3.778107827s" podCreationTimestamp="2026-01-15 01:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:19:54.763622844 +0000 UTC m=+1.187063487" watchObservedRunningTime="2026-01-15 01:19:54.778107827 +0000 UTC m=+1.201548465" Jan 15 01:19:54.796229 kubelet[3275]: I0115 01:19:54.795735 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" podStartSLOduration=3.795720915 podStartE2EDuration="3.795720915s" podCreationTimestamp="2026-01-15 01:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:19:54.778737611 +0000 UTC m=+1.202178245" watchObservedRunningTime="2026-01-15 01:19:54.795720915 +0000 UTC m=+1.219161559" Jan 15 01:19:57.712896 kubelet[3275]: I0115 01:19:57.710939 3275 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 15 01:19:57.712896 kubelet[3275]: I0115 01:19:57.711403 3275 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 15 01:19:57.713301 containerd[1715]: time="2026-01-15T01:19:57.711276150Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 15 01:19:58.629157 systemd[1]: Created slice kubepods-besteffort-poda7a8fbac_0b1f_4753_9d74_4b06994d0770.slice - libcontainer container kubepods-besteffort-poda7a8fbac_0b1f_4753_9d74_4b06994d0770.slice. 
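The slice created above follows the kubelet's systemd cgroup-driver naming: the QoS class plus the pod UID with dashes replaced by underscores (the dashed UID itself appears in the volume records that follow). A minimal sketch reproducing the name, with an illustrative helper that is not kubelet code:

    def pod_slice_name(qos_class: str, pod_uid: str) -> str:
        # "-" is systemd's slice-hierarchy separator, so the kubelet swaps it for "_" in the UID
        return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

    print(pod_slice_name("besteffort", "a7a8fbac-0b1f-4753-9d74-4b06994d0770"))
    # kubepods-besteffort-poda7a8fbac_0b1f_4753_9d74_4b06994d0770.slice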
Jan 15 01:19:58.711464 kubelet[3275]: I0115 01:19:58.711401 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a7a8fbac-0b1f-4753-9d74-4b06994d0770-xtables-lock\") pod \"kube-proxy-c2r47\" (UID: \"a7a8fbac-0b1f-4753-9d74-4b06994d0770\") " pod="kube-system/kube-proxy-c2r47" Jan 15 01:19:58.711464 kubelet[3275]: I0115 01:19:58.711435 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7a8fbac-0b1f-4753-9d74-4b06994d0770-lib-modules\") pod \"kube-proxy-c2r47\" (UID: \"a7a8fbac-0b1f-4753-9d74-4b06994d0770\") " pod="kube-system/kube-proxy-c2r47" Jan 15 01:19:58.711676 kubelet[3275]: I0115 01:19:58.711478 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zs98\" (UniqueName: \"kubernetes.io/projected/a7a8fbac-0b1f-4753-9d74-4b06994d0770-kube-api-access-7zs98\") pod \"kube-proxy-c2r47\" (UID: \"a7a8fbac-0b1f-4753-9d74-4b06994d0770\") " pod="kube-system/kube-proxy-c2r47" Jan 15 01:19:58.711676 kubelet[3275]: I0115 01:19:58.711537 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a7a8fbac-0b1f-4753-9d74-4b06994d0770-kube-proxy\") pod \"kube-proxy-c2r47\" (UID: \"a7a8fbac-0b1f-4753-9d74-4b06994d0770\") " pod="kube-system/kube-proxy-c2r47" Jan 15 01:19:58.855722 systemd[1]: Created slice kubepods-besteffort-pod47794552_371a_43d0_8342_935d77c1bf13.slice - libcontainer container kubepods-besteffort-pod47794552_371a_43d0_8342_935d77c1bf13.slice. Jan 15 01:19:58.913767 kubelet[3275]: I0115 01:19:58.913668 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/47794552-371a-43d0-8342-935d77c1bf13-var-lib-calico\") pod \"tigera-operator-7dcd859c48-sl5jh\" (UID: \"47794552-371a-43d0-8342-935d77c1bf13\") " pod="tigera-operator/tigera-operator-7dcd859c48-sl5jh" Jan 15 01:19:58.913767 kubelet[3275]: I0115 01:19:58.913742 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dk5x\" (UniqueName: \"kubernetes.io/projected/47794552-371a-43d0-8342-935d77c1bf13-kube-api-access-8dk5x\") pod \"tigera-operator-7dcd859c48-sl5jh\" (UID: \"47794552-371a-43d0-8342-935d77c1bf13\") " pod="tigera-operator/tigera-operator-7dcd859c48-sl5jh" Jan 15 01:19:58.939407 containerd[1715]: time="2026-01-15T01:19:58.939309036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c2r47,Uid:a7a8fbac-0b1f-4753-9d74-4b06994d0770,Namespace:kube-system,Attempt:0,}" Jan 15 01:19:58.960439 containerd[1715]: time="2026-01-15T01:19:58.960159855Z" level=info msg="connecting to shim 3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915" address="unix:///run/containerd/s/4404c8c5b6ea53f9b2c74b92c1d87681ae63282af1485de1b4bed4184df6aa0e" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:19:58.985377 systemd[1]: Started cri-containerd-3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915.scope - libcontainer container 3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915. 
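The "connecting to shim" line above names a per-task containerd shim socket under /run/containerd/s/. A quick sketch that checks one of those paths really is a unix socket while the task is running (an ad-hoc check on the node, not something the components above execute):

    import os
    import stat

    shim = "/run/containerd/s/4404c8c5b6ea53f9b2c74b92c1d87681ae63282af1485de1b4bed4184df6aa0e"
    mode = os.stat(shim).st_mode   # raises FileNotFoundError once the task is gone
    print(shim, "->", "unix socket" if stat.S_ISSOCK(mode) else stat.filemode(mode))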
Jan 15 01:19:58.997974 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 15 01:19:58.998092 kernel: audit: type=1334 audit(1768439998.994:441): prog-id=133 op=LOAD Jan 15 01:19:58.994000 audit: BPF prog-id=133 op=LOAD Jan 15 01:19:58.994000 audit: BPF prog-id=134 op=LOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.000584 kernel: audit: type=1334 audit(1768439998.994:442): prog-id=134 op=LOAD Jan 15 01:19:59.000660 kernel: audit: type=1300 audit(1768439998.994:442): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.005058 kernel: audit: type=1327 audit(1768439998.994:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=134 op=UNLOAD Jan 15 01:19:59.008386 kernel: audit: type=1334 audit(1768439998.994:443): prog-id=134 op=UNLOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.010534 kernel: audit: type=1300 audit(1768439998.994:443): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:59.014328 kernel: audit: type=1327 audit(1768439998.994:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=135 op=LOAD Jan 15 01:19:59.017478 kernel: audit: type=1334 audit(1768439998.994:444): prog-id=135 op=LOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.019455 kernel: audit: type=1300 audit(1768439998.994:444): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:59.023660 kernel: audit: type=1327 audit(1768439998.994:444): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=136 op=LOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=136 op=UNLOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=135 op=UNLOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:58.994000 audit: BPF prog-id=137 op=LOAD Jan 15 01:19:58.994000 audit[3339]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3328 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:58.994000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323034313062303861313531656335363436316261626637613833 Jan 15 01:19:59.030605 containerd[1715]: time="2026-01-15T01:19:59.030570602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c2r47,Uid:a7a8fbac-0b1f-4753-9d74-4b06994d0770,Namespace:kube-system,Attempt:0,} returns sandbox id \"3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915\"" Jan 15 01:19:59.038912 containerd[1715]: time="2026-01-15T01:19:59.038882370Z" level=info msg="CreateContainer within sandbox \"3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 15 01:19:59.051818 containerd[1715]: time="2026-01-15T01:19:59.051782505Z" level=info msg="Container 47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:19:59.060081 containerd[1715]: time="2026-01-15T01:19:59.060049835Z" level=info msg="CreateContainer within sandbox \"3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349\"" Jan 15 01:19:59.061457 containerd[1715]: time="2026-01-15T01:19:59.061433873Z" level=info msg="StartContainer for \"47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349\"" Jan 15 01:19:59.063278 containerd[1715]: time="2026-01-15T01:19:59.063226070Z" level=info msg="connecting to shim 47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349" address="unix:///run/containerd/s/4404c8c5b6ea53f9b2c74b92c1d87681ae63282af1485de1b4bed4184df6aa0e" protocol=ttrpc version=3 Jan 15 01:19:59.089261 systemd[1]: Started cri-containerd-47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349.scope - libcontainer container 47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349. 
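The sandbox and container IDs in the containerd messages above reappear as the cri-containerd-<id>.scope units that systemd starts. A minimal sketch of how one might cross-check them against containerd's own view, using the containerd 1.x Go client (the import paths moved under a /v2 module in newer releases); this is purely illustrative and not part of the recorded boot flow:

package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Talk to the same containerd instance whose shim connections are logged
	// above; CRI-managed containers live in the "k8s.io" namespace.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		// These IDs should line up with the cri-containerd-<id>.scope units.
		fmt.Println(c.ID())
	}
}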
Jan 15 01:19:59.133000 audit: BPF prog-id=138 op=LOAD Jan 15 01:19:59.133000 audit[3365]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3328 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437643637663935623237343536333339623037353432373361373063 Jan 15 01:19:59.133000 audit: BPF prog-id=139 op=LOAD Jan 15 01:19:59.133000 audit[3365]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3328 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437643637663935623237343536333339623037353432373361373063 Jan 15 01:19:59.133000 audit: BPF prog-id=139 op=UNLOAD Jan 15 01:19:59.133000 audit[3365]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437643637663935623237343536333339623037353432373361373063 Jan 15 01:19:59.133000 audit: BPF prog-id=138 op=UNLOAD Jan 15 01:19:59.133000 audit[3365]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3328 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437643637663935623237343536333339623037353432373361373063 Jan 15 01:19:59.133000 audit: BPF prog-id=140 op=LOAD Jan 15 01:19:59.133000 audit[3365]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3328 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437643637663935623237343536333339623037353432373361373063 Jan 15 01:19:59.152910 containerd[1715]: time="2026-01-15T01:19:59.152712793Z" level=info msg="StartContainer for 
\"47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349\" returns successfully" Jan 15 01:19:59.160841 containerd[1715]: time="2026-01-15T01:19:59.160809911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-sl5jh,Uid:47794552-371a-43d0-8342-935d77c1bf13,Namespace:tigera-operator,Attempt:0,}" Jan 15 01:19:59.183878 containerd[1715]: time="2026-01-15T01:19:59.183742434Z" level=info msg="connecting to shim 56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d" address="unix:///run/containerd/s/44959c5fd90f1d056ac218737007edb472d070383e0c655fe37eb5ab7401d7c9" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:19:59.216429 systemd[1]: Started cri-containerd-56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d.scope - libcontainer container 56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d. Jan 15 01:19:59.231000 audit: BPF prog-id=141 op=LOAD Jan 15 01:19:59.231000 audit: BPF prog-id=142 op=LOAD Jan 15 01:19:59.231000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.231000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.231000 audit: BPF prog-id=142 op=UNLOAD Jan 15 01:19:59.231000 audit[3416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.231000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.232000 audit: BPF prog-id=143 op=LOAD Jan 15 01:19:59.232000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.232000 audit: BPF prog-id=144 op=LOAD Jan 15 01:19:59.232000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.232000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.232000 audit: BPF prog-id=144 op=UNLOAD Jan 15 01:19:59.232000 audit[3416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.232000 audit: BPF prog-id=143 op=UNLOAD Jan 15 01:19:59.232000 audit[3416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.232000 audit: BPF prog-id=145 op=LOAD Jan 15 01:19:59.232000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3405 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.232000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343933613136353262316666336235333865393736626134366430 Jan 15 01:19:59.273000 audit[3467]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3467 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.273000 audit[3467]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc0f3af0c0 a2=0 a3=7ffc0f3af0ac items=0 ppid=3378 pid=3467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.273000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 01:19:59.274000 audit[3469]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3469 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.274000 audit[3468]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3468 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.274000 audit[3468]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd3ae1ce20 a2=0 a3=7ffd3ae1ce0c items=0 ppid=3378 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.274000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 01:19:59.274000 audit[3469]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd6213e530 a2=0 a3=7ffd6213e51c items=0 ppid=3378 pid=3469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.274000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 15 01:19:59.276000 audit[3471]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3471 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.277000 audit[3470]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3470 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.277000 audit[3470]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc12afcae0 a2=0 a3=7ffc12afcacc items=0 ppid=3378 pid=3470 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.277000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 15 01:19:59.276000 audit[3471]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe0716f970 a2=0 a3=7ffe0716f95c items=0 ppid=3378 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.276000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 01:19:59.279000 audit[3472]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3472 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.279000 audit[3472]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffaa6e2280 a2=0 a3=7fffaa6e226c items=0 ppid=3378 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.279000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 01:19:59.301835 containerd[1715]: time="2026-01-15T01:19:59.301696316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-sl5jh,Uid:47794552-371a-43d0-8342-935d77c1bf13,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d\"" Jan 15 01:19:59.304263 containerd[1715]: time="2026-01-15T01:19:59.304202697Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 15 01:19:59.382000 audit[3480]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3480 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.382000 audit[3480]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 
a0=3 a1=7ffd8d490500 a2=0 a3=7ffd8d4904ec items=0 ppid=3378 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.382000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 01:19:59.386000 audit[3482]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3482 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.386000 audit[3482]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff588ccd20 a2=0 a3=7fff588ccd0c items=0 ppid=3378 pid=3482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.386000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 15 01:19:59.389000 audit[3485]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3485 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.389000 audit[3485]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffee04b4e60 a2=0 a3=7ffee04b4e4c items=0 ppid=3378 pid=3485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.389000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 15 01:19:59.391000 audit[3486]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3486 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.391000 audit[3486]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc8eeb020 a2=0 a3=7ffcc8eeb00c items=0 ppid=3378 pid=3486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.391000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 01:19:59.393000 audit[3488]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3488 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.393000 audit[3488]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffa9966390 a2=0 a3=7fffa996637c items=0 ppid=3378 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.393000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 01:19:59.394000 audit[3489]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3489 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.394000 audit[3489]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1d47f8f0 a2=0 a3=7ffe1d47f8dc items=0 ppid=3378 pid=3489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.394000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 01:19:59.397000 audit[3491]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3491 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.397000 audit[3491]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff9eb26b50 a2=0 a3=7fff9eb26b3c items=0 ppid=3378 pid=3491 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.397000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 01:19:59.401000 audit[3494]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3494 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.401000 audit[3494]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffcf8719db0 a2=0 a3=7ffcf8719d9c items=0 ppid=3378 pid=3494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.401000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 15 01:19:59.403000 audit[3495]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3495 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.403000 audit[3495]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd4cf8bf50 a2=0 a3=7ffd4cf8bf3c items=0 ppid=3378 pid=3495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.403000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 01:19:59.405000 audit[3497]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3497 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.405000 audit[3497]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe41076b40 a2=0 a3=7ffe41076b2c items=0 
ppid=3378 pid=3497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.405000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 01:19:59.406000 audit[3498]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3498 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.406000 audit[3498]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc8d12dcd0 a2=0 a3=7ffc8d12dcbc items=0 ppid=3378 pid=3498 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.406000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 01:19:59.409000 audit[3500]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3500 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.409000 audit[3500]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcda90ebd0 a2=0 a3=7ffcda90ebbc items=0 ppid=3378 pid=3500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.409000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 01:19:59.413000 audit[3503]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3503 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.413000 audit[3503]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe00a8b1e0 a2=0 a3=7ffe00a8b1cc items=0 ppid=3378 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.413000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 01:19:59.418000 audit[3506]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3506 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.418000 audit[3506]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffde356ca0 a2=0 a3=7fffde356c8c items=0 ppid=3378 pid=3506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.418000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 01:19:59.419000 audit[3507]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3507 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.419000 audit[3507]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcb946f8b0 a2=0 a3=7ffcb946f89c items=0 ppid=3378 pid=3507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.419000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 01:19:59.423000 audit[3509]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.423000 audit[3509]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff862e0d30 a2=0 a3=7fff862e0d1c items=0 ppid=3378 pid=3509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.423000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 01:19:59.428000 audit[3512]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3512 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.428000 audit[3512]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd2d3df430 a2=0 a3=7ffd2d3df41c items=0 ppid=3378 pid=3512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.428000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 01:19:59.430000 audit[3513]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3513 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.430000 audit[3513]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc66efe810 a2=0 a3=7ffc66efe7fc items=0 ppid=3378 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.430000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 01:19:59.432000 audit[3515]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3515 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 01:19:59.432000 audit[3515]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffe560cd9a0 a2=0 a3=7ffe560cd98c items=0 ppid=3378 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.432000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 01:19:59.455000 audit[3521]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:19:59.455000 audit[3521]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd7da00480 a2=0 a3=7ffd7da0046c items=0 ppid=3378 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.455000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:19:59.468000 audit[3521]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:19:59.468000 audit[3521]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffd7da00480 a2=0 a3=7ffd7da0046c items=0 ppid=3378 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.468000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:19:59.470000 audit[3526]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3526 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.470000 audit[3526]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffd2e2eef0 a2=0 a3=7fffd2e2eedc items=0 ppid=3378 pid=3526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.470000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 01:19:59.473000 audit[3528]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3528 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.473000 audit[3528]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffd5e19c090 a2=0 a3=7ffd5e19c07c items=0 ppid=3378 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.473000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 15 01:19:59.478000 audit[3531]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3531 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.478000 audit[3531]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=752 a0=3 a1=7ffd4d767d80 a2=0 a3=7ffd4d767d6c items=0 ppid=3378 pid=3531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.478000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 15 01:19:59.479000 audit[3532]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3532 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.479000 audit[3532]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1ab80df0 a2=0 a3=7ffd1ab80ddc items=0 ppid=3378 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.479000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 01:19:59.482000 audit[3534]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3534 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.482000 audit[3534]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffea846c660 a2=0 a3=7ffea846c64c items=0 ppid=3378 pid=3534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.482000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 01:19:59.483000 audit[3535]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3535 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.483000 audit[3535]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc7f7a92d0 a2=0 a3=7ffc7f7a92bc items=0 ppid=3378 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.483000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 01:19:59.486000 audit[3537]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3537 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.486000 audit[3537]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd80971520 a2=0 a3=7ffd8097150c items=0 ppid=3378 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.486000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 15 01:19:59.490000 audit[3540]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3540 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.490000 audit[3540]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffcaf1b31d0 a2=0 a3=7ffcaf1b31bc items=0 ppid=3378 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.490000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 01:19:59.491000 audit[3541]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3541 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.491000 audit[3541]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee4407d60 a2=0 a3=7ffee4407d4c items=0 ppid=3378 pid=3541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.491000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 01:19:59.495000 audit[3543]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3543 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.495000 audit[3543]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb4165dc0 a2=0 a3=7ffeb4165dac items=0 ppid=3378 pid=3543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.495000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 01:19:59.497000 audit[3544]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3544 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.497000 audit[3544]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcf3899170 a2=0 a3=7ffcf389915c items=0 ppid=3378 pid=3544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.497000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 01:19:59.500000 audit[3546]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3546 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.500000 audit[3546]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe7adff5d0 a2=0 a3=7ffe7adff5bc 
items=0 ppid=3378 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.500000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 01:19:59.507000 audit[3549]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3549 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.507000 audit[3549]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffce64307c0 a2=0 a3=7ffce64307ac items=0 ppid=3378 pid=3549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.507000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 01:19:59.511000 audit[3552]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3552 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.511000 audit[3552]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe3d400cc0 a2=0 a3=7ffe3d400cac items=0 ppid=3378 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.511000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 15 01:19:59.513000 audit[3553]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3553 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.513000 audit[3553]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd0d61d350 a2=0 a3=7ffd0d61d33c items=0 ppid=3378 pid=3553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.513000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 01:19:59.516000 audit[3555]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3555 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.516000 audit[3555]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fffe46c5cf0 a2=0 a3=7fffe46c5cdc items=0 ppid=3378 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.516000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 01:19:59.521000 audit[3558]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3558 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.521000 audit[3558]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffcc7c50b0 a2=0 a3=7fffcc7c509c items=0 ppid=3378 pid=3558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.521000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 01:19:59.522000 audit[3559]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3559 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.522000 audit[3559]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe9b89b130 a2=0 a3=7ffe9b89b11c items=0 ppid=3378 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.522000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 01:19:59.525000 audit[3561]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3561 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.525000 audit[3561]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffea4c38b40 a2=0 a3=7ffea4c38b2c items=0 ppid=3378 pid=3561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.525000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 01:19:59.527000 audit[3562]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3562 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.527000 audit[3562]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff19f71180 a2=0 a3=7fff19f7116c items=0 ppid=3378 pid=3562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.527000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 01:19:59.530000 audit[3564]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3564 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.530000 audit[3564]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffefe6bfc70 a2=0 a3=7ffefe6bfc5c items=0 ppid=3378 pid=3564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.530000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 01:19:59.534000 audit[3567]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3567 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 01:19:59.534000 audit[3567]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd5a92a090 a2=0 a3=7ffd5a92a07c items=0 ppid=3378 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.534000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 01:19:59.541000 audit[3569]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3569 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 01:19:59.541000 audit[3569]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fff71311600 a2=0 a3=7fff713115ec items=0 ppid=3378 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.541000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:19:59.541000 audit[3569]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3569 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 01:19:59.541000 audit[3569]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fff71311600 a2=0 a3=7fff713115ec items=0 ppid=3378 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:19:59.541000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:19:59.837845 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount316194810.mount: Deactivated successfully. Jan 15 01:20:01.898708 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3863848671.mount: Deactivated successfully. 
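The PROCTITLE payloads in the audit records above are hex-encoded argv vectors with NUL separators; decoding them recovers the exact runc and iptables/ip6tables command lines behind each NETFILTER_CFG event (for example, the creation of the KUBE-PROXY-CANARY chain in the mangle table). A small Go sketch of the decoding; the helper name is mine and the sample value is copied from one of the records above:

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle converts an audit PROCTITLE hex payload (argv elements
// joined by NUL bytes) back into a readable command line.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	// Taken from the audit[3467] NETFILTER_CFG record above.
	const p = "69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65"
	cmd, err := decodeProctitle(p)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd) // iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle
}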
Jan 15 01:20:02.366993 containerd[1715]: time="2026-01-15T01:20:02.366412846Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:02.367409 containerd[1715]: time="2026-01-15T01:20:02.367369776Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 15 01:20:02.368947 containerd[1715]: time="2026-01-15T01:20:02.368922590Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:02.374806 containerd[1715]: time="2026-01-15T01:20:02.371359540Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:02.374806 containerd[1715]: time="2026-01-15T01:20:02.371896108Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.067633334s" Jan 15 01:20:02.374806 containerd[1715]: time="2026-01-15T01:20:02.371915248Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 15 01:20:02.374806 containerd[1715]: time="2026-01-15T01:20:02.373717670Z" level=info msg="CreateContainer within sandbox \"56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 15 01:20:02.384478 containerd[1715]: time="2026-01-15T01:20:02.384443783Z" level=info msg="Container 88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:02.386681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2726900046.mount: Deactivated successfully. Jan 15 01:20:02.393750 containerd[1715]: time="2026-01-15T01:20:02.393713142Z" level=info msg="CreateContainer within sandbox \"56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986\"" Jan 15 01:20:02.394545 containerd[1715]: time="2026-01-15T01:20:02.394523758Z" level=info msg="StartContainer for \"88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986\"" Jan 15 01:20:02.395348 containerd[1715]: time="2026-01-15T01:20:02.395308478Z" level=info msg="connecting to shim 88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986" address="unix:///run/containerd/s/44959c5fd90f1d056ac218737007edb472d070383e0c655fe37eb5ab7401d7c9" protocol=ttrpc version=3 Jan 15 01:20:02.416270 systemd[1]: Started cri-containerd-88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986.scope - libcontainer container 88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986. 
Jan 15 01:20:02.426000 audit: BPF prog-id=146 op=LOAD Jan 15 01:20:02.426000 audit: BPF prog-id=147 op=LOAD Jan 15 01:20:02.426000 audit[3578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.426000 audit: BPF prog-id=147 op=UNLOAD Jan 15 01:20:02.426000 audit[3578]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.426000 audit: BPF prog-id=148 op=LOAD Jan 15 01:20:02.426000 audit[3578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.426000 audit: BPF prog-id=149 op=LOAD Jan 15 01:20:02.426000 audit[3578]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.427000 audit: BPF prog-id=149 op=UNLOAD Jan 15 01:20:02.427000 audit[3578]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.427000 audit: BPF prog-id=148 op=UNLOAD Jan 15 01:20:02.427000 audit[3578]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.427000 audit: BPF prog-id=150 op=LOAD Jan 15 01:20:02.427000 audit[3578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=3405 pid=3578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:02.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838636463323732363364633561626261313930383036303066336363 Jan 15 01:20:02.444447 containerd[1715]: time="2026-01-15T01:20:02.444419006Z" level=info msg="StartContainer for \"88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986\" returns successfully" Jan 15 01:20:02.745519 kubelet[3275]: I0115 01:20:02.745459 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-c2r47" podStartSLOduration=4.745440186 podStartE2EDuration="4.745440186s" podCreationTimestamp="2026-01-15 01:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:19:59.750464535 +0000 UTC m=+6.173905174" watchObservedRunningTime="2026-01-15 01:20:02.745440186 +0000 UTC m=+9.168880828" Jan 15 01:20:06.434018 sudo[2332]: pam_unix(sudo:session): session closed for user root Jan 15 01:20:06.440073 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 15 01:20:06.440316 kernel: audit: type=1106 audit(1768440006.433:521): pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:20:06.433000 audit[2332]: USER_END pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:20:06.433000 audit[2332]: CRED_DISP pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 01:20:06.449050 kernel: audit: type=1104 audit(1768440006.433:522): pid=2332 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 15 01:20:06.537114 sshd[2331]: Connection closed by 4.153.228.146 port 54936 Jan 15 01:20:06.540132 sshd-session[2328]: pam_unix(sshd:session): session closed for user core Jan 15 01:20:06.540000 audit[2328]: USER_END pid=2328 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:20:06.544830 systemd[1]: sshd@6-10.0.7.78:22-4.153.228.146:54936.service: Deactivated successfully. Jan 15 01:20:06.547762 systemd[1]: session-7.scope: Deactivated successfully. Jan 15 01:20:06.548141 kernel: audit: type=1106 audit(1768440006.540:523): pid=2328 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:20:06.548436 systemd[1]: session-7.scope: Consumed 3.902s CPU time, 227.1M memory peak. Jan 15 01:20:06.540000 audit[2328]: CRED_DISP pid=2328 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:20:06.551679 systemd-logind[1682]: Session 7 logged out. Waiting for processes to exit. Jan 15 01:20:06.554253 kernel: audit: type=1104 audit(1768440006.540:524): pid=2328 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 01:20:06.554699 systemd-logind[1682]: Removed session 7. Jan 15 01:20:06.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.7.78:22-4.153.228.146:54936 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:20:06.561083 kernel: audit: type=1131 audit(1768440006.544:525): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.7.78:22-4.153.228.146:54936 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:20:07.260552 kubelet[3275]: I0115 01:20:07.260498 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-sl5jh" podStartSLOduration=6.19084641 podStartE2EDuration="9.26047492s" podCreationTimestamp="2026-01-15 01:19:58 +0000 UTC" firstStartedPulling="2026-01-15 01:19:59.303064598 +0000 UTC m=+5.726505219" lastFinishedPulling="2026-01-15 01:20:02.372693109 +0000 UTC m=+8.796133729" observedRunningTime="2026-01-15 01:20:02.746433492 +0000 UTC m=+9.169874134" watchObservedRunningTime="2026-01-15 01:20:07.26047492 +0000 UTC m=+13.683915562" Jan 15 01:20:07.366000 audit[3665]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3665 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:07.371076 kernel: audit: type=1325 audit(1768440007.366:526): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3665 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:07.366000 audit[3665]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcff51c660 a2=0 a3=7ffcff51c64c items=0 ppid=3378 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.378050 kernel: audit: type=1300 audit(1768440007.366:526): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcff51c660 a2=0 a3=7ffcff51c64c items=0 ppid=3378 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.366000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:07.384034 kernel: audit: type=1327 audit(1768440007.366:526): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:07.372000 audit[3665]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3665 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:07.388036 kernel: audit: type=1325 audit(1768440007.372:527): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3665 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:07.372000 audit[3665]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcff51c660 a2=0 a3=0 items=0 ppid=3378 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.394029 kernel: audit: type=1300 audit(1768440007.372:527): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcff51c660 a2=0 a3=0 items=0 ppid=3378 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.372000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:07.412000 audit[3667]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3667 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 
01:20:07.412000 audit[3667]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc0a78b3f0 a2=0 a3=7ffc0a78b3dc items=0 ppid=3378 pid=3667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.412000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:07.417000 audit[3667]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3667 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:07.417000 audit[3667]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc0a78b3f0 a2=0 a3=0 items=0 ppid=3378 pid=3667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:07.417000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:09.879000 audit[3669]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3669 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:09.879000 audit[3669]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffce31d4280 a2=0 a3=7ffce31d426c items=0 ppid=3378 pid=3669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:09.879000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:09.884000 audit[3669]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3669 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:09.884000 audit[3669]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffce31d4280 a2=0 a3=0 items=0 ppid=3378 pid=3669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:09.884000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:09.897000 audit[3671]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3671 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:09.897000 audit[3671]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffcef600950 a2=0 a3=7ffcef60093c items=0 ppid=3378 pid=3671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:09.897000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:09.904000 audit[3671]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3671 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:09.904000 audit[3671]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcef600950 a2=0 a3=0 items=0 
ppid=3378 pid=3671 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:09.904000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:10.914000 audit[3673]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3673 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:10.914000 audit[3673]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd790d41a0 a2=0 a3=7ffd790d418c items=0 ppid=3378 pid=3673 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:10.914000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:10.918000 audit[3673]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3673 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:10.918000 audit[3673]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd790d41a0 a2=0 a3=0 items=0 ppid=3378 pid=3673 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:10.918000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:11.608321 systemd[1]: Created slice kubepods-besteffort-pod56ddef29_6fbd_4362_a6ea_2276a891264e.slice - libcontainer container kubepods-besteffort-pod56ddef29_6fbd_4362_a6ea_2276a891264e.slice. Jan 15 01:20:11.690535 kubelet[3275]: I0115 01:20:11.690497 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d598s\" (UniqueName: \"kubernetes.io/projected/56ddef29-6fbd-4362-a6ea-2276a891264e-kube-api-access-d598s\") pod \"calico-typha-764d546866-6z8pf\" (UID: \"56ddef29-6fbd-4362-a6ea-2276a891264e\") " pod="calico-system/calico-typha-764d546866-6z8pf" Jan 15 01:20:11.691467 kubelet[3275]: I0115 01:20:11.691432 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/56ddef29-6fbd-4362-a6ea-2276a891264e-typha-certs\") pod \"calico-typha-764d546866-6z8pf\" (UID: \"56ddef29-6fbd-4362-a6ea-2276a891264e\") " pod="calico-system/calico-typha-764d546866-6z8pf" Jan 15 01:20:11.691504 kubelet[3275]: I0115 01:20:11.691469 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ddef29-6fbd-4362-a6ea-2276a891264e-tigera-ca-bundle\") pod \"calico-typha-764d546866-6z8pf\" (UID: \"56ddef29-6fbd-4362-a6ea-2276a891264e\") " pod="calico-system/calico-typha-764d546866-6z8pf" Jan 15 01:20:11.879773 systemd[1]: Created slice kubepods-besteffort-pod12aa595e_1fc9_4709_8018_8dbe9f9bda3a.slice - libcontainer container kubepods-besteffort-pod12aa595e_1fc9_4709_8018_8dbe9f9bda3a.slice. 
Jan 15 01:20:11.893355 kubelet[3275]: I0115 01:20:11.893149 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-lib-modules\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893355 kubelet[3275]: I0115 01:20:11.893216 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-var-run-calico\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893355 kubelet[3275]: I0115 01:20:11.893247 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-flexvol-driver-host\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893355 kubelet[3275]: I0115 01:20:11.893263 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvd2w\" (UniqueName: \"kubernetes.io/projected/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-kube-api-access-gvd2w\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893355 kubelet[3275]: I0115 01:20:11.893279 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-cni-log-dir\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893573 kubelet[3275]: I0115 01:20:11.893294 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-tigera-ca-bundle\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893573 kubelet[3275]: I0115 01:20:11.893309 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-var-lib-calico\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893573 kubelet[3275]: I0115 01:20:11.893323 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-cni-bin-dir\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893573 kubelet[3275]: I0115 01:20:11.893496 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-xtables-lock\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893573 kubelet[3275]: I0115 01:20:11.893535 3275 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-node-certs\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893675 kubelet[3275]: I0115 01:20:11.893553 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-policysync\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.893844 kubelet[3275]: I0115 01:20:11.893725 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/12aa595e-1fc9-4709-8018-8dbe9f9bda3a-cni-net-dir\") pod \"calico-node-jjvh5\" (UID: \"12aa595e-1fc9-4709-8018-8dbe9f9bda3a\") " pod="calico-system/calico-node-jjvh5" Jan 15 01:20:11.914876 containerd[1715]: time="2026-01-15T01:20:11.914827967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-764d546866-6z8pf,Uid:56ddef29-6fbd-4362-a6ea-2276a891264e,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:11.929000 audit[3677]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3677 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:11.931987 kernel: kauditd_printk_skb: 25 callbacks suppressed Jan 15 01:20:11.932065 kernel: audit: type=1325 audit(1768440011.929:536): table=filter:115 family=2 entries=21 op=nft_register_rule pid=3677 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:11.929000 audit[3677]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc3d7e3a70 a2=0 a3=7ffc3d7e3a5c items=0 ppid=3378 pid=3677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:11.937215 kernel: audit: type=1300 audit(1768440011.929:536): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc3d7e3a70 a2=0 a3=7ffc3d7e3a5c items=0 ppid=3378 pid=3677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:11.929000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:11.946049 kernel: audit: type=1327 audit(1768440011.929:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:11.941000 audit[3677]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3677 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:11.951026 kernel: audit: type=1325 audit(1768440011.941:537): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3677 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:11.941000 audit[3677]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc3d7e3a70 a2=0 a3=0 items=0 ppid=3378 pid=3677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:11.941000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:11.957331 kernel: audit: type=1300 audit(1768440011.941:537): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc3d7e3a70 a2=0 a3=0 items=0 ppid=3378 pid=3677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:11.957376 kernel: audit: type=1327 audit(1768440011.941:537): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:11.960440 containerd[1715]: time="2026-01-15T01:20:11.960406687Z" level=info msg="connecting to shim 58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0" address="unix:///run/containerd/s/3e876776e29d85332bee007831df5098f60f1f33ef93f0d687f41ef7a1a47b42" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:11.996282 systemd[1]: Started cri-containerd-58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0.scope - libcontainer container 58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0. Jan 15 01:20:11.998068 kubelet[3275]: E0115 01:20:11.997430 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:11.998068 kubelet[3275]: W0115 01:20:11.997450 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:11.998068 kubelet[3275]: E0115 01:20:11.997478 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:11.998068 kubelet[3275]: E0115 01:20:11.998051 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:11.998068 kubelet[3275]: W0115 01:20:11.998062 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:11.998228 kubelet[3275]: E0115 01:20:11.998074 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:11.998945 kubelet[3275]: E0115 01:20:11.998540 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:11.998945 kubelet[3275]: W0115 01:20:11.998653 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:11.998945 kubelet[3275]: E0115 01:20:11.998665 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:11.999674 kubelet[3275]: E0115 01:20:11.999223 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:11.999674 kubelet[3275]: W0115 01:20:11.999234 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:11.999674 kubelet[3275]: E0115 01:20:11.999243 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.005721 kubelet[3275]: E0115 01:20:12.005686 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.005721 kubelet[3275]: W0115 01:20:12.005702 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.005721 kubelet[3275]: E0115 01:20:12.005715 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.005939 kubelet[3275]: E0115 01:20:12.005909 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.005939 kubelet[3275]: W0115 01:20:12.005919 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.005939 kubelet[3275]: E0115 01:20:12.005927 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.013293 kubelet[3275]: E0115 01:20:12.013271 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.013293 kubelet[3275]: W0115 01:20:12.013287 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.013293 kubelet[3275]: E0115 01:20:12.013302 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.021000 audit: BPF prog-id=151 op=LOAD Jan 15 01:20:12.024684 kernel: audit: type=1334 audit(1768440012.021:538): prog-id=151 op=LOAD Jan 15 01:20:12.023000 audit: BPF prog-id=152 op=LOAD Jan 15 01:20:12.023000 audit[3698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.028452 kernel: audit: type=1334 audit(1768440012.023:539): prog-id=152 op=LOAD Jan 15 01:20:12.028499 kernel: audit: type=1300 audit(1768440012.023:539): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.023000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.033080 kernel: audit: type=1327 audit(1768440012.023:539): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.023000 audit: BPF prog-id=152 op=UNLOAD Jan 15 01:20:12.023000 audit[3698]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.023000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.023000 audit: BPF prog-id=153 op=LOAD Jan 15 01:20:12.023000 audit[3698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.023000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.024000 audit: BPF prog-id=154 op=LOAD Jan 15 01:20:12.024000 audit[3698]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.024000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.024000 audit: BPF prog-id=154 op=UNLOAD Jan 15 01:20:12.024000 audit[3698]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.024000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.024000 audit: BPF prog-id=153 op=UNLOAD Jan 15 01:20:12.024000 audit[3698]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.024000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.024000 audit: BPF prog-id=155 op=LOAD Jan 15 01:20:12.024000 audit[3698]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3687 pid=3698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.024000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538636564633736663538373136343661363631366232353664636232 Jan 15 01:20:12.082156 kubelet[3275]: E0115 01:20:12.082102 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:12.089240 kubelet[3275]: E0115 01:20:12.089212 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.089347 kubelet[3275]: W0115 01:20:12.089231 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.089347 kubelet[3275]: E0115 01:20:12.089267 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.089693 kubelet[3275]: E0115 01:20:12.089640 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.089693 kubelet[3275]: W0115 01:20:12.089654 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.089693 kubelet[3275]: E0115 01:20:12.089664 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.090076 kubelet[3275]: E0115 01:20:12.089885 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.090076 kubelet[3275]: W0115 01:20:12.089891 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.090076 kubelet[3275]: E0115 01:20:12.089898 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.090980 kubelet[3275]: E0115 01:20:12.090571 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.090980 kubelet[3275]: W0115 01:20:12.090582 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.090980 kubelet[3275]: E0115 01:20:12.090591 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.090980 kubelet[3275]: E0115 01:20:12.090873 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.090980 kubelet[3275]: W0115 01:20:12.090879 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.090980 kubelet[3275]: E0115 01:20:12.090887 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.091178 kubelet[3275]: E0115 01:20:12.091148 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.091178 kubelet[3275]: W0115 01:20:12.091156 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.091178 kubelet[3275]: E0115 01:20:12.091163 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091404 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092248 kubelet[3275]: W0115 01:20:12.091413 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091420 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091566 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092248 kubelet[3275]: W0115 01:20:12.091571 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091578 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091705 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092248 kubelet[3275]: W0115 01:20:12.091710 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091716 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092248 kubelet[3275]: E0115 01:20:12.091824 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092481 kubelet[3275]: W0115 01:20:12.091829 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.091834 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.091935 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092481 kubelet[3275]: W0115 01:20:12.091940 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.091946 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.092078 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092481 kubelet[3275]: W0115 01:20:12.092098 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.092104 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092481 kubelet[3275]: E0115 01:20:12.092232 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092481 kubelet[3275]: W0115 01:20:12.092238 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092244 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092353 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092665 kubelet[3275]: W0115 01:20:12.092368 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092373 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092482 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092665 kubelet[3275]: W0115 01:20:12.092487 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092492 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092596 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092665 kubelet[3275]: W0115 01:20:12.092601 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092665 kubelet[3275]: E0115 01:20:12.092607 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.092852 kubelet[3275]: E0115 01:20:12.092719 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.092852 kubelet[3275]: W0115 01:20:12.092734 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.092852 kubelet[3275]: E0115 01:20:12.092740 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.092985 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.094950 kubelet[3275]: W0115 01:20:12.093040 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.093050 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.093500 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.094950 kubelet[3275]: W0115 01:20:12.093508 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.093515 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.093639 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.094950 kubelet[3275]: W0115 01:20:12.093644 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.093650 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.094950 kubelet[3275]: E0115 01:20:12.094578 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.096371 kubelet[3275]: W0115 01:20:12.094595 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.096371 kubelet[3275]: E0115 01:20:12.094608 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.096371 kubelet[3275]: I0115 01:20:12.094632 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddb26c79-6272-4ee5-ba41-ad8ec552e6c6-kubelet-dir\") pod \"csi-node-driver-srjh9\" (UID: \"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6\") " pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:12.096371 kubelet[3275]: E0115 01:20:12.096073 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.096371 kubelet[3275]: W0115 01:20:12.096097 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.096371 kubelet[3275]: E0115 01:20:12.096115 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.096371 kubelet[3275]: I0115 01:20:12.096135 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ddb26c79-6272-4ee5-ba41-ad8ec552e6c6-socket-dir\") pod \"csi-node-driver-srjh9\" (UID: \"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6\") " pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:12.096371 kubelet[3275]: E0115 01:20:12.096289 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.096532 kubelet[3275]: W0115 01:20:12.096301 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.096532 kubelet[3275]: E0115 01:20:12.096319 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.096532 kubelet[3275]: E0115 01:20:12.096465 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.096532 kubelet[3275]: W0115 01:20:12.096471 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.096532 kubelet[3275]: E0115 01:20:12.096479 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.096627 kubelet[3275]: E0115 01:20:12.096603 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.096627 kubelet[3275]: W0115 01:20:12.096610 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.096627 kubelet[3275]: E0115 01:20:12.096617 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.096681 kubelet[3275]: I0115 01:20:12.096634 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ddb26c79-6272-4ee5-ba41-ad8ec552e6c6-registration-dir\") pod \"csi-node-driver-srjh9\" (UID: \"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6\") " pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:12.097242 kubelet[3275]: E0115 01:20:12.097093 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.097242 kubelet[3275]: W0115 01:20:12.097105 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.097242 kubelet[3275]: E0115 01:20:12.097156 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.097311 kubelet[3275]: I0115 01:20:12.097290 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lc26\" (UniqueName: \"kubernetes.io/projected/ddb26c79-6272-4ee5-ba41-ad8ec552e6c6-kube-api-access-6lc26\") pod \"csi-node-driver-srjh9\" (UID: \"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6\") " pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:12.097497 kubelet[3275]: E0115 01:20:12.097343 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.097497 kubelet[3275]: W0115 01:20:12.097352 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.097497 kubelet[3275]: E0115 01:20:12.097362 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.097497 kubelet[3275]: E0115 01:20:12.097474 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.097497 kubelet[3275]: W0115 01:20:12.097479 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.097497 kubelet[3275]: E0115 01:20:12.097489 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.097598 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.098266 kubelet[3275]: W0115 01:20:12.097603 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.097616 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.098062 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.098266 kubelet[3275]: W0115 01:20:12.098070 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.098085 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.098243 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.098266 kubelet[3275]: W0115 01:20:12.098248 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.098266 kubelet[3275]: E0115 01:20:12.098262 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.098452 kubelet[3275]: I0115 01:20:12.098277 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ddb26c79-6272-4ee5-ba41-ad8ec552e6c6-varrun\") pod \"csi-node-driver-srjh9\" (UID: \"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6\") " pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:12.099153 kubelet[3275]: E0115 01:20:12.099102 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.099153 kubelet[3275]: W0115 01:20:12.099113 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.099234 kubelet[3275]: E0115 01:20:12.099227 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.099573 kubelet[3275]: E0115 01:20:12.099269 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.099573 kubelet[3275]: W0115 01:20:12.099276 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.099573 kubelet[3275]: E0115 01:20:12.099284 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.099573 kubelet[3275]: E0115 01:20:12.099386 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.099573 kubelet[3275]: W0115 01:20:12.099391 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.099573 kubelet[3275]: E0115 01:20:12.099397 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.099993 kubelet[3275]: E0115 01:20:12.099981 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.099993 kubelet[3275]: W0115 01:20:12.099992 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.100051 kubelet[3275]: E0115 01:20:12.100001 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.147911 containerd[1715]: time="2026-01-15T01:20:12.146892916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-764d546866-6z8pf,Uid:56ddef29-6fbd-4362-a6ea-2276a891264e,Namespace:calico-system,Attempt:0,} returns sandbox id \"58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0\"" Jan 15 01:20:12.151892 containerd[1715]: time="2026-01-15T01:20:12.151867047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 15 01:20:12.186241 containerd[1715]: time="2026-01-15T01:20:12.186204933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jjvh5,Uid:12aa595e-1fc9-4709-8018-8dbe9f9bda3a,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:12.199610 kubelet[3275]: E0115 01:20:12.199567 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.199610 kubelet[3275]: W0115 01:20:12.199600 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.199826 kubelet[3275]: E0115 01:20:12.199618 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.200046 kubelet[3275]: E0115 01:20:12.200033 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.200046 kubelet[3275]: W0115 01:20:12.200043 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.200254 kubelet[3275]: E0115 01:20:12.200057 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.200407 kubelet[3275]: E0115 01:20:12.200394 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.200449 kubelet[3275]: W0115 01:20:12.200440 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.200520 kubelet[3275]: E0115 01:20:12.200510 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.200785 kubelet[3275]: E0115 01:20:12.200771 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.200785 kubelet[3275]: W0115 01:20:12.200781 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.200853 kubelet[3275]: E0115 01:20:12.200795 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.201161 kubelet[3275]: E0115 01:20:12.201058 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.201161 kubelet[3275]: W0115 01:20:12.201076 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.201161 kubelet[3275]: E0115 01:20:12.201091 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.201402 kubelet[3275]: E0115 01:20:12.201393 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.201781 kubelet[3275]: W0115 01:20:12.201453 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.201781 kubelet[3275]: E0115 01:20:12.201469 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.202208 kubelet[3275]: E0115 01:20:12.202120 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.202208 kubelet[3275]: W0115 01:20:12.202130 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.202208 kubelet[3275]: E0115 01:20:12.202150 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.202379 kubelet[3275]: E0115 01:20:12.202372 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.202424 kubelet[3275]: W0115 01:20:12.202412 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.202586 kubelet[3275]: E0115 01:20:12.202571 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.203025 kubelet[3275]: E0115 01:20:12.202746 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.203025 kubelet[3275]: W0115 01:20:12.202754 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.203025 kubelet[3275]: E0115 01:20:12.202771 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.203153 kubelet[3275]: E0115 01:20:12.203145 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.203189 kubelet[3275]: W0115 01:20:12.203183 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.203245 kubelet[3275]: E0115 01:20:12.203232 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.204030 kubelet[3275]: E0115 01:20:12.203409 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.204030 kubelet[3275]: W0115 01:20:12.203418 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.204030 kubelet[3275]: E0115 01:20:12.203431 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.204192 kubelet[3275]: E0115 01:20:12.204163 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.204229 kubelet[3275]: W0115 01:20:12.204222 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.204277 kubelet[3275]: E0115 01:20:12.204270 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.204548 kubelet[3275]: E0115 01:20:12.204480 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.204548 kubelet[3275]: W0115 01:20:12.204487 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.204548 kubelet[3275]: E0115 01:20:12.204505 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.204842 kubelet[3275]: E0115 01:20:12.204834 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.204885 kubelet[3275]: W0115 01:20:12.204879 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.204974 kubelet[3275]: E0115 01:20:12.204954 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.205240 kubelet[3275]: E0115 01:20:12.205166 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.205327 kubelet[3275]: W0115 01:20:12.205319 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.205801 kubelet[3275]: E0115 01:20:12.205785 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.206072 kubelet[3275]: E0115 01:20:12.205978 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.206072 kubelet[3275]: W0115 01:20:12.205987 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.206072 kubelet[3275]: E0115 01:20:12.206005 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.207186 kubelet[3275]: E0115 01:20:12.207094 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.207186 kubelet[3275]: W0115 01:20:12.207104 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.207186 kubelet[3275]: E0115 01:20:12.207118 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.207323 kubelet[3275]: E0115 01:20:12.207317 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.208052 kubelet[3275]: W0115 01:20:12.207352 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.208134 kubelet[3275]: E0115 01:20:12.208124 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.208438 kubelet[3275]: E0115 01:20:12.208339 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.208438 kubelet[3275]: W0115 01:20:12.208347 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.208438 kubelet[3275]: E0115 01:20:12.208354 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.208619 kubelet[3275]: E0115 01:20:12.208613 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.208654 kubelet[3275]: W0115 01:20:12.208649 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.208841 kubelet[3275]: E0115 01:20:12.208825 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.208967 kubelet[3275]: E0115 01:20:12.208897 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.208967 kubelet[3275]: W0115 01:20:12.208905 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.208967 kubelet[3275]: E0115 01:20:12.208923 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.209147 kubelet[3275]: E0115 01:20:12.209141 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.209289 kubelet[3275]: W0115 01:20:12.209184 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.209289 kubelet[3275]: E0115 01:20:12.209199 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:12.210089 kubelet[3275]: E0115 01:20:12.210080 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.210133 kubelet[3275]: W0115 01:20:12.210128 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.210172 kubelet[3275]: E0115 01:20:12.210166 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.210410 kubelet[3275]: E0115 01:20:12.210323 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.210410 kubelet[3275]: W0115 01:20:12.210330 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.210410 kubelet[3275]: E0115 01:20:12.210337 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.210516 kubelet[3275]: E0115 01:20:12.210511 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.210551 kubelet[3275]: W0115 01:20:12.210546 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.210586 kubelet[3275]: E0115 01:20:12.210580 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.231878 kubelet[3275]: E0115 01:20:12.231804 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:12.231878 kubelet[3275]: W0115 01:20:12.231824 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:12.231878 kubelet[3275]: E0115 01:20:12.231841 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:12.232224 containerd[1715]: time="2026-01-15T01:20:12.232195548Z" level=info msg="connecting to shim 3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097" address="unix:///run/containerd/s/b33b4498050dda8cc0b5ddd166873b61b76fe206d972878c43a53da06ca4fa9d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:12.262287 systemd[1]: Started cri-containerd-3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097.scope - libcontainer container 3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097. 
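The burst of kubelet errors above is FlexVolume plugin probing: kubelet execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, the binary is not on disk yet (the flexvol-driver container created from the pod2daemon-flexvol image later in this log is what installs it), so the call yields no output and the empty string cannot be decoded as JSON. The Go sketch below reproduces that failure mode under those assumptions; it is an illustration, not kubelet's own driver-call code, and the DriverStatus struct is a simplified stand-in for the FlexVolume response format.

// Minimal sketch (not kubelet's actual driver-call code) of the failure mode
// behind the repeated messages above: the FlexVolume probe execs the driver
// binary with "init", gets no stdout because the binary is not installed yet,
// and the empty output then fails to decode as JSON.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus mirrors, in reduced and illustrative form, the JSON a FlexVolume
// driver is expected to print, e.g. {"status":"Success","capabilities":{"attach":false}}.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// Path taken from the log; at this point in the boot the binary is absent,
	// so the exec fails and out stays empty.
	driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		fmt.Println("driver call failed:", err) // analogous to the W driver-call.go:149 lines
	}

	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// On empty input encoding/json returns exactly the error seen above:
		// "unexpected end of JSON input".
		fmt.Println("failed to unmarshal output:", err)
	}
}

Once the uds binary is present at that path, the same init probe returns a small JSON status object and these errors should stop.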
Jan 15 01:20:12.278000 audit: BPF prog-id=156 op=LOAD Jan 15 01:20:12.279000 audit: BPF prog-id=157 op=LOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.279000 audit: BPF prog-id=157 op=UNLOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.279000 audit: BPF prog-id=158 op=LOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.279000 audit: BPF prog-id=159 op=LOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.279000 audit: BPF prog-id=159 op=UNLOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.279000 audit: BPF prog-id=158 op=UNLOAD Jan 15 01:20:12.279000 audit[3826]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.279000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.280000 audit: BPF prog-id=160 op=LOAD Jan 15 01:20:12.280000 audit[3826]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3813 pid=3826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:12.280000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353833333665646539306464306364363239306233613364613230 Jan 15 01:20:12.300861 containerd[1715]: time="2026-01-15T01:20:12.300810913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jjvh5,Uid:12aa595e-1fc9-4709-8018-8dbe9f9bda3a,Namespace:calico-system,Attempt:0,} returns sandbox id \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\"" Jan 15 01:20:13.655291 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1222708386.mount: Deactivated successfully. Jan 15 01:20:13.694692 kubelet[3275]: E0115 01:20:13.694202 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:14.758803 containerd[1715]: time="2026-01-15T01:20:14.758763974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:14.759907 containerd[1715]: time="2026-01-15T01:20:14.759871087Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 15 01:20:14.762020 containerd[1715]: time="2026-01-15T01:20:14.761956649Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:14.765680 containerd[1715]: time="2026-01-15T01:20:14.765618313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:14.766128 containerd[1715]: time="2026-01-15T01:20:14.766092066Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.613947932s" Jan 15 01:20:14.766262 containerd[1715]: 
time="2026-01-15T01:20:14.766196454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 15 01:20:14.767556 containerd[1715]: time="2026-01-15T01:20:14.767533547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 15 01:20:14.786857 containerd[1715]: time="2026-01-15T01:20:14.786824605Z" level=info msg="CreateContainer within sandbox \"58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 15 01:20:14.798168 containerd[1715]: time="2026-01-15T01:20:14.798131515Z" level=info msg="Container 8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:14.801283 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2142695059.mount: Deactivated successfully. Jan 15 01:20:14.808828 containerd[1715]: time="2026-01-15T01:20:14.808690843Z" level=info msg="CreateContainer within sandbox \"58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d\"" Jan 15 01:20:14.810287 containerd[1715]: time="2026-01-15T01:20:14.810259436Z" level=info msg="StartContainer for \"8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d\"" Jan 15 01:20:14.812335 containerd[1715]: time="2026-01-15T01:20:14.812299760Z" level=info msg="connecting to shim 8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d" address="unix:///run/containerd/s/3e876776e29d85332bee007831df5098f60f1f33ef93f0d687f41ef7a1a47b42" protocol=ttrpc version=3 Jan 15 01:20:14.836248 systemd[1]: Started cri-containerd-8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d.scope - libcontainer container 8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d. 
Jan 15 01:20:14.847000 audit: BPF prog-id=161 op=LOAD Jan 15 01:20:14.848000 audit: BPF prog-id=162 op=LOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=162 op=UNLOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=163 op=LOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=164 op=LOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=164 op=UNLOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=163 op=UNLOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.848000 audit: BPF prog-id=165 op=LOAD Jan 15 01:20:14.848000 audit[3862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=3687 pid=3862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:14.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836353262353363343733346563323563613262333861326233316162 Jan 15 01:20:14.883862 containerd[1715]: time="2026-01-15T01:20:14.883831147Z" level=info msg="StartContainer for \"8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d\" returns successfully" Jan 15 01:20:15.696654 kubelet[3275]: E0115 01:20:15.696556 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:15.818358 kubelet[3275]: E0115 01:20:15.818256 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.818358 kubelet[3275]: W0115 01:20:15.818291 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.818358 kubelet[3275]: E0115 01:20:15.818312 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.818721 kubelet[3275]: E0115 01:20:15.818709 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.818792 kubelet[3275]: W0115 01:20:15.818755 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.818792 kubelet[3275]: E0115 01:20:15.818765 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.818997 kubelet[3275]: E0115 01:20:15.818984 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.819103 kubelet[3275]: W0115 01:20:15.819052 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.819103 kubelet[3275]: E0115 01:20:15.819063 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.819364 kubelet[3275]: E0115 01:20:15.819310 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.819364 kubelet[3275]: W0115 01:20:15.819322 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.819364 kubelet[3275]: E0115 01:20:15.819329 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.819597 kubelet[3275]: E0115 01:20:15.819554 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.819597 kubelet[3275]: W0115 01:20:15.819561 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.819597 kubelet[3275]: E0115 01:20:15.819568 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.819818 kubelet[3275]: E0115 01:20:15.819782 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.819818 kubelet[3275]: W0115 01:20:15.819788 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.819818 kubelet[3275]: E0115 01:20:15.819795 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.820097 kubelet[3275]: E0115 01:20:15.820046 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.820097 kubelet[3275]: W0115 01:20:15.820052 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.820097 kubelet[3275]: E0115 01:20:15.820059 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.820353 kubelet[3275]: E0115 01:20:15.820300 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.820353 kubelet[3275]: W0115 01:20:15.820307 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.820353 kubelet[3275]: E0115 01:20:15.820313 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.820572 kubelet[3275]: E0115 01:20:15.820533 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.820572 kubelet[3275]: W0115 01:20:15.820540 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.820572 kubelet[3275]: E0115 01:20:15.820546 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.820788 kubelet[3275]: E0115 01:20:15.820751 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.820788 kubelet[3275]: W0115 01:20:15.820758 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.820788 kubelet[3275]: E0115 01:20:15.820764 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.821025 kubelet[3275]: E0115 01:20:15.820973 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.821025 kubelet[3275]: W0115 01:20:15.820979 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.821025 kubelet[3275]: E0115 01:20:15.820985 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.821251 kubelet[3275]: E0115 01:20:15.821204 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.821251 kubelet[3275]: W0115 01:20:15.821210 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.821251 kubelet[3275]: E0115 01:20:15.821217 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.821465 kubelet[3275]: E0115 01:20:15.821426 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.821465 kubelet[3275]: W0115 01:20:15.821432 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.821465 kubelet[3275]: E0115 01:20:15.821439 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.821679 kubelet[3275]: E0115 01:20:15.821642 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.821679 kubelet[3275]: W0115 01:20:15.821648 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.821679 kubelet[3275]: E0115 01:20:15.821654 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.821906 kubelet[3275]: E0115 01:20:15.821856 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.821906 kubelet[3275]: W0115 01:20:15.821862 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.821906 kubelet[3275]: E0115 01:20:15.821869 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.828599 kubelet[3275]: E0115 01:20:15.828568 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.828794 kubelet[3275]: W0115 01:20:15.828708 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.828794 kubelet[3275]: E0115 01:20:15.828729 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.829060 kubelet[3275]: E0115 01:20:15.829041 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.829060 kubelet[3275]: W0115 01:20:15.829050 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.829157 kubelet[3275]: E0115 01:20:15.829129 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.829312 kubelet[3275]: E0115 01:20:15.829305 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.829394 kubelet[3275]: W0115 01:20:15.829345 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.829394 kubelet[3275]: E0115 01:20:15.829362 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.829581 kubelet[3275]: E0115 01:20:15.829566 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.829581 kubelet[3275]: W0115 01:20:15.829573 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.829686 kubelet[3275]: E0115 01:20:15.829634 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.829880 kubelet[3275]: E0115 01:20:15.829814 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.829880 kubelet[3275]: W0115 01:20:15.829821 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.829880 kubelet[3275]: E0115 01:20:15.829829 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.830116 kubelet[3275]: E0115 01:20:15.830100 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.830116 kubelet[3275]: W0115 01:20:15.830107 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.830222 kubelet[3275]: E0115 01:20:15.830172 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.830566 kubelet[3275]: E0115 01:20:15.830464 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.830566 kubelet[3275]: W0115 01:20:15.830471 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.830566 kubelet[3275]: E0115 01:20:15.830479 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.830700 kubelet[3275]: E0115 01:20:15.830694 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.830737 kubelet[3275]: W0115 01:20:15.830731 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.830846 kubelet[3275]: E0115 01:20:15.830829 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.830933 kubelet[3275]: E0115 01:20:15.830918 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.830933 kubelet[3275]: W0115 01:20:15.830925 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.831047 kubelet[3275]: E0115 01:20:15.831028 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.831181 kubelet[3275]: E0115 01:20:15.831176 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.831269 kubelet[3275]: W0115 01:20:15.831215 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.831269 kubelet[3275]: E0115 01:20:15.831230 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.831431 kubelet[3275]: E0115 01:20:15.831426 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.831470 kubelet[3275]: W0115 01:20:15.831461 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.831529 kubelet[3275]: E0115 01:20:15.831503 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.831686 kubelet[3275]: E0115 01:20:15.831672 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.831686 kubelet[3275]: W0115 01:20:15.831678 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.831781 kubelet[3275]: E0115 01:20:15.831737 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.831960 kubelet[3275]: E0115 01:20:15.831945 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.831960 kubelet[3275]: W0115 01:20:15.831952 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.832144 kubelet[3275]: E0115 01:20:15.832041 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.832478 kubelet[3275]: E0115 01:20:15.832467 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.832591 kubelet[3275]: W0115 01:20:15.832520 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.832591 kubelet[3275]: E0115 01:20:15.832531 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.832668 kubelet[3275]: E0115 01:20:15.832663 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.832779 kubelet[3275]: W0115 01:20:15.832700 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.832779 kubelet[3275]: E0115 01:20:15.832708 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.832898 kubelet[3275]: E0115 01:20:15.832862 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.832898 kubelet[3275]: W0115 01:20:15.832868 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.832898 kubelet[3275]: E0115 01:20:15.832874 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:15.833128 kubelet[3275]: E0115 01:20:15.833097 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.833128 kubelet[3275]: W0115 01:20:15.833104 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.833128 kubelet[3275]: E0115 01:20:15.833110 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 01:20:15.833422 kubelet[3275]: E0115 01:20:15.833394 3275 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 01:20:15.833422 kubelet[3275]: W0115 01:20:15.833401 3275 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 01:20:15.833422 kubelet[3275]: E0115 01:20:15.833408 3275 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 01:20:16.356946 containerd[1715]: time="2026-01-15T01:20:16.356522093Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:16.357882 containerd[1715]: time="2026-01-15T01:20:16.357859305Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:16.359177 containerd[1715]: time="2026-01-15T01:20:16.359143006Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:16.364944 containerd[1715]: time="2026-01-15T01:20:16.364621844Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:16.365400 containerd[1715]: time="2026-01-15T01:20:16.364930177Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.597370085s" Jan 15 01:20:16.365461 containerd[1715]: time="2026-01-15T01:20:16.365452138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 15 01:20:16.369049 containerd[1715]: time="2026-01-15T01:20:16.369027878Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 15 01:20:16.384252 containerd[1715]: time="2026-01-15T01:20:16.384199544Z" level=info msg="Container 0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:16.401461 containerd[1715]: time="2026-01-15T01:20:16.401350191Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff\"" Jan 15 01:20:16.402222 containerd[1715]: time="2026-01-15T01:20:16.402040367Z" level=info msg="StartContainer for \"0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff\"" Jan 15 01:20:16.405240 containerd[1715]: time="2026-01-15T01:20:16.405213540Z" level=info msg="connecting to shim 
0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff" address="unix:///run/containerd/s/b33b4498050dda8cc0b5ddd166873b61b76fe206d972878c43a53da06ca4fa9d" protocol=ttrpc version=3 Jan 15 01:20:16.436384 systemd[1]: Started cri-containerd-0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff.scope - libcontainer container 0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff. Jan 15 01:20:16.495000 audit: BPF prog-id=166 op=LOAD Jan 15 01:20:16.495000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3813 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:16.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063613230393430306138343832306366656364396137663164316435 Jan 15 01:20:16.497000 audit: BPF prog-id=167 op=LOAD Jan 15 01:20:16.497000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3813 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:16.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063613230393430306138343832306366656364396137663164316435 Jan 15 01:20:16.497000 audit: BPF prog-id=167 op=UNLOAD Jan 15 01:20:16.497000 audit[3938]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:16.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063613230393430306138343832306366656364396137663164316435 Jan 15 01:20:16.497000 audit: BPF prog-id=166 op=UNLOAD Jan 15 01:20:16.497000 audit[3938]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:16.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063613230393430306138343832306366656364396137663164316435 Jan 15 01:20:16.497000 audit: BPF prog-id=168 op=LOAD Jan 15 01:20:16.497000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3813 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:16.497000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063613230393430306138343832306366656364396137663164316435 Jan 15 01:20:16.522163 containerd[1715]: time="2026-01-15T01:20:16.522125406Z" level=info msg="StartContainer for \"0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff\" returns successfully" Jan 15 01:20:16.529116 systemd[1]: cri-containerd-0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff.scope: Deactivated successfully. Jan 15 01:20:16.532499 containerd[1715]: time="2026-01-15T01:20:16.532466591Z" level=info msg="received container exit event container_id:\"0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff\" id:\"0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff\" pid:3950 exited_at:{seconds:1768440016 nanos:532051503}" Jan 15 01:20:16.534000 audit: BPF prog-id=168 op=UNLOAD Jan 15 01:20:16.557529 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff-rootfs.mount: Deactivated successfully. Jan 15 01:20:16.768222 kubelet[3275]: I0115 01:20:16.768168 3275 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 01:20:16.785324 kubelet[3275]: I0115 01:20:16.784684 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-764d546866-6z8pf" podStartSLOduration=3.169387218 podStartE2EDuration="5.784669055s" podCreationTimestamp="2026-01-15 01:20:11 +0000 UTC" firstStartedPulling="2026-01-15 01:20:12.151577405 +0000 UTC m=+18.575018026" lastFinishedPulling="2026-01-15 01:20:14.766859243 +0000 UTC m=+21.190299863" observedRunningTime="2026-01-15 01:20:15.782230188 +0000 UTC m=+22.205670809" watchObservedRunningTime="2026-01-15 01:20:16.784669055 +0000 UTC m=+23.208109693" Jan 15 01:20:17.691135 kubelet[3275]: E0115 01:20:17.691081 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:18.774341 containerd[1715]: time="2026-01-15T01:20:18.774221261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 15 01:20:19.691002 kubelet[3275]: E0115 01:20:19.690938 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:21.692983 kubelet[3275]: E0115 01:20:21.692230 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:22.665123 containerd[1715]: time="2026-01-15T01:20:22.665079583Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:22.667520 containerd[1715]: time="2026-01-15T01:20:22.667075322Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 15 01:20:22.670096 containerd[1715]: time="2026-01-15T01:20:22.669444305Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:22.671243 containerd[1715]: time="2026-01-15T01:20:22.670829716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:22.674400 containerd[1715]: time="2026-01-15T01:20:22.671916984Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.896840765s" Jan 15 01:20:22.674400 containerd[1715]: time="2026-01-15T01:20:22.671949005Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 15 01:20:22.676688 containerd[1715]: time="2026-01-15T01:20:22.676650756Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 01:20:22.693028 containerd[1715]: time="2026-01-15T01:20:22.692715409Z" level=info msg="Container cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:22.709526 containerd[1715]: time="2026-01-15T01:20:22.709491515Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906\"" Jan 15 01:20:22.710354 containerd[1715]: time="2026-01-15T01:20:22.710302190Z" level=info msg="StartContainer for \"cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906\"" Jan 15 01:20:22.712165 containerd[1715]: time="2026-01-15T01:20:22.712055180Z" level=info msg="connecting to shim cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906" address="unix:///run/containerd/s/b33b4498050dda8cc0b5ddd166873b61b76fe206d972878c43a53da06ca4fa9d" protocol=ttrpc version=3 Jan 15 01:20:22.735277 systemd[1]: Started cri-containerd-cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906.scope - libcontainer container cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906. 
Jan 15 01:20:22.776069 kernel: kauditd_printk_skb: 78 callbacks suppressed Jan 15 01:20:22.776186 kernel: audit: type=1334 audit(1768440022.772:568): prog-id=169 op=LOAD Jan 15 01:20:22.772000 audit: BPF prog-id=169 op=LOAD Jan 15 01:20:22.772000 audit[3997]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.781039 kernel: audit: type=1300 audit(1768440022.772:568): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.781121 kernel: audit: type=1327 audit(1768440022.772:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.772000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.775000 audit: BPF prog-id=170 op=LOAD Jan 15 01:20:22.786063 kernel: audit: type=1334 audit(1768440022.775:569): prog-id=170 op=LOAD Jan 15 01:20:22.787029 kernel: audit: type=1300 audit(1768440022.775:569): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.775000 audit[3997]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.793517 kernel: audit: type=1327 audit(1768440022.775:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.775000 audit: BPF prog-id=170 op=UNLOAD Jan 15 01:20:22.775000 audit[3997]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.799238 kernel: audit: type=1334 audit(1768440022.775:570): prog-id=170 op=UNLOAD Jan 15 01:20:22.799294 kernel: audit: type=1300 
audit(1768440022.775:570): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.803957 kernel: audit: type=1327 audit(1768440022.775:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.775000 audit: BPF prog-id=169 op=UNLOAD Jan 15 01:20:22.806544 kernel: audit: type=1334 audit(1768440022.775:571): prog-id=169 op=UNLOAD Jan 15 01:20:22.775000 audit[3997]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.775000 audit: BPF prog-id=171 op=LOAD Jan 15 01:20:22.775000 audit[3997]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3813 pid=3997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:22.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364643630366362636362623632663765663730333332386164666361 Jan 15 01:20:22.825952 containerd[1715]: time="2026-01-15T01:20:22.825882904Z" level=info msg="StartContainer for \"cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906\" returns successfully" Jan 15 01:20:23.693132 kubelet[3275]: E0115 01:20:23.692570 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:24.127903 containerd[1715]: time="2026-01-15T01:20:24.127628929Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 01:20:24.130948 systemd[1]: cri-containerd-cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906.scope: Deactivated successfully. 
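Editor's note: the audit records around each BPF prog-id LOAD/UNLOAD carry the runc invocation as a hex-encoded PROCTITLE, with NUL bytes separating the argv elements. A small decoder makes them readable; the hex below is the prefix of the proctitle strings above (the real ones end in a container ID that the log truncates, and it stays truncated):

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex string back into a command line.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// The kernel separates argv entries with NUL bytes.
	return strings.Join(strings.Split(string(raw), "\x00"), " "), nil
}

func main() {
	// Prefix of the proctitle values in the audit lines above.
	h := "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
	s, err := decodeProctitle(h)
	if err != nil {
		panic(err)
	}
	fmt.Println(s) // runc --root /run/containerd/runc/k8s.io
}
```

The full strings decode to runc being invoked with --root /run/containerd/runc/k8s.io and --log pointing at /run/containerd/io.containerd.runtime.v2.task/k8s.io/<truncated container id>, i.e. the shim starting the install-cni container.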
Jan 15 01:20:24.131463 systemd[1]: cri-containerd-cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906.scope: Consumed 454ms CPU time, 192.6M memory peak, 171.3M written to disk. Jan 15 01:20:24.133361 containerd[1715]: time="2026-01-15T01:20:24.133335210Z" level=info msg="received container exit event container_id:\"cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906\" id:\"cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906\" pid:4011 exited_at:{seconds:1768440024 nanos:133178666}" Jan 15 01:20:24.135000 audit: BPF prog-id=171 op=UNLOAD Jan 15 01:20:24.155753 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906-rootfs.mount: Deactivated successfully. Jan 15 01:20:24.194599 kubelet[3275]: I0115 01:20:24.194569 3275 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 15 01:20:24.242778 systemd[1]: Created slice kubepods-burstable-pod80ef1833_e881_4202_8a43_efb4cbd7eee4.slice - libcontainer container kubepods-burstable-pod80ef1833_e881_4202_8a43_efb4cbd7eee4.slice. Jan 15 01:20:24.254840 systemd[1]: Created slice kubepods-burstable-pod1740dd66_3e8a_4fbd_88e8_778f39eb7186.slice - libcontainer container kubepods-burstable-pod1740dd66_3e8a_4fbd_88e8_778f39eb7186.slice. Jan 15 01:20:24.264633 systemd[1]: Created slice kubepods-besteffort-pod0d14115d_26fb_4eac_a6b9_b5aa96406bb8.slice - libcontainer container kubepods-besteffort-pod0d14115d_26fb_4eac_a6b9_b5aa96406bb8.slice. Jan 15 01:20:24.270736 systemd[1]: Created slice kubepods-besteffort-podfc1c990f_1003_460d_a72d_34a2a5fb4d83.slice - libcontainer container kubepods-besteffort-podfc1c990f_1003_460d_a72d_34a2a5fb4d83.slice. Jan 15 01:20:24.278221 systemd[1]: Created slice kubepods-besteffort-pod19b9076e_57b5_41dc_b303_63eafd79e78c.slice - libcontainer container kubepods-besteffort-pod19b9076e_57b5_41dc_b303_63eafd79e78c.slice. Jan 15 01:20:24.283103 systemd[1]: Created slice kubepods-besteffort-podbcaf6132_cf28_4c4b_82a5_f573197628f9.slice - libcontainer container kubepods-besteffort-podbcaf6132_cf28_4c4b_82a5_f573197628f9.slice. 
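Editor's note: the "Created slice kubepods-…" lines show the kubelet's systemd cgroup driver at work: each pod gets a slice under its QoS class, with the dashes in the pod UID replaced by underscores because systemd reserves "-" for slice nesting. A throwaway helper that reproduces the names above, assuming that convention (guaranteed-QoS pods are placed differently and are not shown):

```go
package main

import (
	"fmt"
	"strings"
)

// podSliceName builds the systemd slice name used for a burstable or
// besteffort pod: UID dashes become underscores.
func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSliceName("burstable", "80ef1833-e881-4202-8a43-efb4cbd7eee4"))
	// kubepods-burstable-pod80ef1833_e881_4202_8a43_efb4cbd7eee4.slice
	fmt.Println(podSliceName("besteffort", "ddb26c79-6272-4ee5-ba41-ad8ec552e6c6"))
	// kubepods-besteffort-podddb26c79_6272_4ee5_ba41_ad8ec552e6c6.slice
}
```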
Jan 15 01:20:24.287129 kubelet[3275]: I0115 01:20:24.287056 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1740dd66-3e8a-4fbd-88e8-778f39eb7186-config-volume\") pod \"coredns-668d6bf9bc-2sh5x\" (UID: \"1740dd66-3e8a-4fbd-88e8-778f39eb7186\") " pod="kube-system/coredns-668d6bf9bc-2sh5x" Jan 15 01:20:24.288367 kubelet[3275]: I0115 01:20:24.287540 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmtnw\" (UniqueName: \"kubernetes.io/projected/fc1c990f-1003-460d-a72d-34a2a5fb4d83-kube-api-access-mmtnw\") pod \"calico-apiserver-5b8cdf5dcc-grnmf\" (UID: \"fc1c990f-1003-460d-a72d-34a2a5fb4d83\") " pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" Jan 15 01:20:24.288367 kubelet[3275]: I0115 01:20:24.287565 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-ca-bundle\") pod \"whisker-5c698fd89f-gfp6l\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " pod="calico-system/whisker-5c698fd89f-gfp6l" Jan 15 01:20:24.288367 kubelet[3275]: I0115 01:20:24.287581 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80ef1833-e881-4202-8a43-efb4cbd7eee4-config-volume\") pod \"coredns-668d6bf9bc-7zb8r\" (UID: \"80ef1833-e881-4202-8a43-efb4cbd7eee4\") " pod="kube-system/coredns-668d6bf9bc-7zb8r" Jan 15 01:20:24.288367 kubelet[3275]: I0115 01:20:24.287600 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv49f\" (UniqueName: \"kubernetes.io/projected/80ef1833-e881-4202-8a43-efb4cbd7eee4-kube-api-access-cv49f\") pod \"coredns-668d6bf9bc-7zb8r\" (UID: \"80ef1833-e881-4202-8a43-efb4cbd7eee4\") " pod="kube-system/coredns-668d6bf9bc-7zb8r" Jan 15 01:20:24.288367 kubelet[3275]: I0115 01:20:24.287618 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxp5p\" (UniqueName: \"kubernetes.io/projected/0d14115d-26fb-4eac-a6b9-b5aa96406bb8-kube-api-access-hxp5p\") pod \"calico-kube-controllers-9c45d7c9c-l7rhq\" (UID: \"0d14115d-26fb-4eac-a6b9-b5aa96406bb8\") " pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" Jan 15 01:20:24.288571 kubelet[3275]: I0115 01:20:24.287634 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6ebcb69d-4b71-4631-9554-a5f179cc05ba-calico-apiserver-certs\") pod \"calico-apiserver-5b8cdf5dcc-9rs84\" (UID: \"6ebcb69d-4b71-4631-9554-a5f179cc05ba\") " pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" Jan 15 01:20:24.288571 kubelet[3275]: I0115 01:20:24.287648 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxqh6\" (UniqueName: \"kubernetes.io/projected/19b9076e-57b5-41dc-b303-63eafd79e78c-kube-api-access-bxqh6\") pod \"goldmane-666569f655-qxn98\" (UID: \"19b9076e-57b5-41dc-b303-63eafd79e78c\") " pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:24.288571 kubelet[3275]: I0115 01:20:24.287663 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/19b9076e-57b5-41dc-b303-63eafd79e78c-goldmane-key-pair\") pod \"goldmane-666569f655-qxn98\" (UID: \"19b9076e-57b5-41dc-b303-63eafd79e78c\") " pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:24.288571 kubelet[3275]: I0115 01:20:24.287678 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-backend-key-pair\") pod \"whisker-5c698fd89f-gfp6l\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " pod="calico-system/whisker-5c698fd89f-gfp6l" Jan 15 01:20:24.288571 kubelet[3275]: I0115 01:20:24.287697 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d14115d-26fb-4eac-a6b9-b5aa96406bb8-tigera-ca-bundle\") pod \"calico-kube-controllers-9c45d7c9c-l7rhq\" (UID: \"0d14115d-26fb-4eac-a6b9-b5aa96406bb8\") " pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" Jan 15 01:20:24.288722 kubelet[3275]: I0115 01:20:24.287712 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrgt\" (UniqueName: \"kubernetes.io/projected/bcaf6132-cf28-4c4b-82a5-f573197628f9-kube-api-access-tdrgt\") pod \"whisker-5c698fd89f-gfp6l\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " pod="calico-system/whisker-5c698fd89f-gfp6l" Jan 15 01:20:24.288722 kubelet[3275]: I0115 01:20:24.287729 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fc1c990f-1003-460d-a72d-34a2a5fb4d83-calico-apiserver-certs\") pod \"calico-apiserver-5b8cdf5dcc-grnmf\" (UID: \"fc1c990f-1003-460d-a72d-34a2a5fb4d83\") " pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" Jan 15 01:20:24.288722 kubelet[3275]: I0115 01:20:24.287747 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pf57\" (UniqueName: \"kubernetes.io/projected/1740dd66-3e8a-4fbd-88e8-778f39eb7186-kube-api-access-8pf57\") pod \"coredns-668d6bf9bc-2sh5x\" (UID: \"1740dd66-3e8a-4fbd-88e8-778f39eb7186\") " pod="kube-system/coredns-668d6bf9bc-2sh5x" Jan 15 01:20:24.288722 kubelet[3275]: I0115 01:20:24.287763 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b9076e-57b5-41dc-b303-63eafd79e78c-config\") pod \"goldmane-666569f655-qxn98\" (UID: \"19b9076e-57b5-41dc-b303-63eafd79e78c\") " pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:24.288722 kubelet[3275]: I0115 01:20:24.287783 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19b9076e-57b5-41dc-b303-63eafd79e78c-goldmane-ca-bundle\") pod \"goldmane-666569f655-qxn98\" (UID: \"19b9076e-57b5-41dc-b303-63eafd79e78c\") " pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:24.288828 kubelet[3275]: I0115 01:20:24.287802 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcc99\" (UniqueName: \"kubernetes.io/projected/6ebcb69d-4b71-4631-9554-a5f179cc05ba-kube-api-access-kcc99\") pod \"calico-apiserver-5b8cdf5dcc-9rs84\" (UID: \"6ebcb69d-4b71-4631-9554-a5f179cc05ba\") " 
pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" Jan 15 01:20:24.291242 systemd[1]: Created slice kubepods-besteffort-pod6ebcb69d_4b71_4631_9554_a5f179cc05ba.slice - libcontainer container kubepods-besteffort-pod6ebcb69d_4b71_4631_9554_a5f179cc05ba.slice. Jan 15 01:20:25.015257 containerd[1715]: time="2026-01-15T01:20:25.013763804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7zb8r,Uid:80ef1833-e881-4202-8a43-efb4cbd7eee4,Namespace:kube-system,Attempt:0,}" Jan 15 01:20:25.016118 containerd[1715]: time="2026-01-15T01:20:25.016065825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qxn98,Uid:19b9076e-57b5-41dc-b303-63eafd79e78c,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:25.016183 containerd[1715]: time="2026-01-15T01:20:25.016073197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2sh5x,Uid:1740dd66-3e8a-4fbd-88e8-778f39eb7186,Namespace:kube-system,Attempt:0,}" Jan 15 01:20:25.016616 containerd[1715]: time="2026-01-15T01:20:25.016464901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9c45d7c9c-l7rhq,Uid:0d14115d-26fb-4eac-a6b9-b5aa96406bb8,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:25.016616 containerd[1715]: time="2026-01-15T01:20:25.016531317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-grnmf,Uid:fc1c990f-1003-460d-a72d-34a2a5fb4d83,Namespace:calico-apiserver,Attempt:0,}" Jan 15 01:20:25.016616 containerd[1715]: time="2026-01-15T01:20:25.016591388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c698fd89f-gfp6l,Uid:bcaf6132-cf28-4c4b-82a5-f573197628f9,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:25.016786 containerd[1715]: time="2026-01-15T01:20:25.016765723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-9rs84,Uid:6ebcb69d-4b71-4631-9554-a5f179cc05ba,Namespace:calico-apiserver,Attempt:0,}" Jan 15 01:20:25.699975 systemd[1]: Created slice kubepods-besteffort-podddb26c79_6272_4ee5_ba41_ad8ec552e6c6.slice - libcontainer container kubepods-besteffort-podddb26c79_6272_4ee5_ba41_ad8ec552e6c6.slice. 
Jan 15 01:20:25.703535 containerd[1715]: time="2026-01-15T01:20:25.703051733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-srjh9,Uid:ddb26c79-6272-4ee5-ba41-ad8ec552e6c6,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:25.779798 containerd[1715]: time="2026-01-15T01:20:25.779752537Z" level=error msg="Failed to destroy network for sandbox \"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.788790 containerd[1715]: time="2026-01-15T01:20:25.788076619Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qxn98,Uid:19b9076e-57b5-41dc-b303-63eafd79e78c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.788963 kubelet[3275]: E0115 01:20:25.788295 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.788963 kubelet[3275]: E0115 01:20:25.788349 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:25.788963 kubelet[3275]: E0115 01:20:25.788368 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-qxn98" Jan 15 01:20:25.789443 kubelet[3275]: E0115 01:20:25.788415 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"37b94cd283c11f990baa8f9bffc14e9833915d01b2a0770dd198287ffd834c92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:20:25.808572 containerd[1715]: time="2026-01-15T01:20:25.808538874Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 15 01:20:25.828792 containerd[1715]: time="2026-01-15T01:20:25.828750083Z" level=error msg="Failed to destroy network for sandbox \"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.839992 containerd[1715]: time="2026-01-15T01:20:25.839919357Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2sh5x,Uid:1740dd66-3e8a-4fbd-88e8-778f39eb7186,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.842042 kubelet[3275]: E0115 01:20:25.841002 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.842042 kubelet[3275]: E0115 01:20:25.841064 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2sh5x" Jan 15 01:20:25.842042 kubelet[3275]: E0115 01:20:25.841085 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2sh5x" Jan 15 01:20:25.842194 kubelet[3275]: E0115 01:20:25.841123 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-2sh5x_kube-system(1740dd66-3e8a-4fbd-88e8-778f39eb7186)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-2sh5x_kube-system(1740dd66-3e8a-4fbd-88e8-778f39eb7186)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cce48c23e6bf9b4f431d9bd27326bb09d0c20d124a5d83ac71ab4b1d5d5ab360\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-2sh5x" podUID="1740dd66-3e8a-4fbd-88e8-778f39eb7186" Jan 15 01:20:25.904855 containerd[1715]: time="2026-01-15T01:20:25.904733581Z" level=error msg="Failed to destroy network for sandbox \"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.907286 containerd[1715]: time="2026-01-15T01:20:25.907253376Z" level=error msg="Failed to destroy network for sandbox \"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.908282 containerd[1715]: time="2026-01-15T01:20:25.908246878Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7zb8r,Uid:80ef1833-e881-4202-8a43-efb4cbd7eee4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.908663 kubelet[3275]: E0115 01:20:25.908519 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.908663 kubelet[3275]: E0115 01:20:25.908571 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7zb8r" Jan 15 01:20:25.908663 kubelet[3275]: E0115 01:20:25.908590 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7zb8r" Jan 15 01:20:25.908766 kubelet[3275]: E0115 01:20:25.908631 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-7zb8r_kube-system(80ef1833-e881-4202-8a43-efb4cbd7eee4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-7zb8r_kube-system(80ef1833-e881-4202-8a43-efb4cbd7eee4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3bcd2eff4022864f27745ac69f33531d5a28ade35775862b1428bf40a9af7347\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-7zb8r" podUID="80ef1833-e881-4202-8a43-efb4cbd7eee4" Jan 15 01:20:25.910229 containerd[1715]: time="2026-01-15T01:20:25.910197383Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c698fd89f-gfp6l,Uid:bcaf6132-cf28-4c4b-82a5-f573197628f9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.910837 kubelet[3275]: E0115 01:20:25.910722 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.910837 kubelet[3275]: E0115 01:20:25.910782 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c698fd89f-gfp6l" Jan 15 01:20:25.910837 kubelet[3275]: E0115 01:20:25.910798 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c698fd89f-gfp6l" Jan 15 01:20:25.910987 kubelet[3275]: E0115 01:20:25.910949 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5c698fd89f-gfp6l_calico-system(bcaf6132-cf28-4c4b-82a5-f573197628f9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5c698fd89f-gfp6l_calico-system(bcaf6132-cf28-4c4b-82a5-f573197628f9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"899af4935b4c9a354490d12ce2a739c93eef150c6a6d2faee094db09cf8d32fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5c698fd89f-gfp6l" podUID="bcaf6132-cf28-4c4b-82a5-f573197628f9" Jan 15 01:20:25.914346 containerd[1715]: time="2026-01-15T01:20:25.914135245Z" level=error msg="Failed to destroy network for sandbox \"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.914537 containerd[1715]: time="2026-01-15T01:20:25.914511445Z" level=error msg="Failed to destroy network for sandbox \"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.916604 containerd[1715]: time="2026-01-15T01:20:25.916575214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9c45d7c9c-l7rhq,Uid:0d14115d-26fb-4eac-a6b9-b5aa96406bb8,Namespace:calico-system,Attempt:0,} 
failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.916782 kubelet[3275]: E0115 01:20:25.916708 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.916824 kubelet[3275]: E0115 01:20:25.916744 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" Jan 15 01:20:25.916824 kubelet[3275]: E0115 01:20:25.916802 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" Jan 15 01:20:25.916968 kubelet[3275]: E0115 01:20:25.916834 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"491a99eb4047e583c7edcb34af79c4af65a8b2c26689a447322cb6463ae2039a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:20:25.918618 containerd[1715]: time="2026-01-15T01:20:25.918588553Z" level=error msg="Failed to destroy network for sandbox \"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.918693 containerd[1715]: time="2026-01-15T01:20:25.918601502Z" level=error msg="Failed to destroy network for sandbox \"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.921052 containerd[1715]: time="2026-01-15T01:20:25.920962246Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-9rs84,Uid:6ebcb69d-4b71-4631-9554-a5f179cc05ba,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.921359 kubelet[3275]: E0115 01:20:25.921270 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.921432 kubelet[3275]: E0115 01:20:25.921372 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" Jan 15 01:20:25.921432 kubelet[3275]: E0115 01:20:25.921389 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" Jan 15 01:20:25.921823 kubelet[3275]: E0115 01:20:25.921799 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f8d34970dcc7c3e97bce3f195886f7493e9b74f3bde5768a2f18581e7a9669f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:20:25.923436 containerd[1715]: time="2026-01-15T01:20:25.923357798Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-srjh9,Uid:ddb26c79-6272-4ee5-ba41-ad8ec552e6c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.923545 kubelet[3275]: E0115 01:20:25.923527 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.923603 kubelet[3275]: E0115 01:20:25.923589 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:25.923646 kubelet[3275]: E0115 01:20:25.923614 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-srjh9" Jan 15 01:20:25.923680 kubelet[3275]: E0115 01:20:25.923665 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"be6cc684a1a5c0620321cf41adc66060b74f6ae726bef46b1e1bda4d6eb2a7ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:25.924104 containerd[1715]: time="2026-01-15T01:20:25.924072688Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-grnmf,Uid:fc1c990f-1003-460d-a72d-34a2a5fb4d83,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.924263 kubelet[3275]: E0115 01:20:25.924249 3275 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 01:20:25.924302 kubelet[3275]: E0115 01:20:25.924269 3275 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" Jan 15 01:20:25.924302 kubelet[3275]: E0115 01:20:25.924283 3275 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" Jan 15 01:20:25.924346 kubelet[3275]: E0115 01:20:25.924306 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf52ad095485f0cabeededadd91063a8b553458717e32ce04d9f3a666ecfda82\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:20:26.153566 systemd[1]: run-netns-cni\x2dd7a2de21\x2dde19\x2d265c\x2daa32\x2db73ff78fb6b3.mount: Deactivated successfully. Jan 15 01:20:26.153671 systemd[1]: run-netns-cni\x2d15f12410\x2d1d91\x2da4a8\x2df371\x2dce3868d7e8c2.mount: Deactivated successfully. Jan 15 01:20:26.153725 systemd[1]: run-netns-cni\x2d70e1c4be\x2dd24c\x2d4f67\x2d86ab\x2decda229185dd.mount: Deactivated successfully. Jan 15 01:20:26.153772 systemd[1]: run-netns-cni\x2de79e09bc\x2d65dd\x2df535\x2d7017\x2d3bfb89b4ebf7.mount: Deactivated successfully. Jan 15 01:20:33.768277 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3254923536.mount: Deactivated successfully. 
Jan 15 01:20:33.813675 containerd[1715]: time="2026-01-15T01:20:33.813556259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:33.814595 containerd[1715]: time="2026-01-15T01:20:33.814543707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 15 01:20:33.815457 containerd[1715]: time="2026-01-15T01:20:33.815404993Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:33.817585 containerd[1715]: time="2026-01-15T01:20:33.817539307Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 01:20:33.818065 containerd[1715]: time="2026-01-15T01:20:33.818045391Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 8.009470895s" Jan 15 01:20:33.818193 containerd[1715]: time="2026-01-15T01:20:33.818113377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 15 01:20:33.833630 containerd[1715]: time="2026-01-15T01:20:33.833596305Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 01:20:33.863466 containerd[1715]: time="2026-01-15T01:20:33.863419410Z" level=info msg="Container 251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:33.866860 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2899354288.mount: Deactivated successfully. Jan 15 01:20:33.880800 containerd[1715]: time="2026-01-15T01:20:33.880757650Z" level=info msg="CreateContainer within sandbox \"3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6\"" Jan 15 01:20:33.881328 containerd[1715]: time="2026-01-15T01:20:33.881309249Z" level=info msg="StartContainer for \"251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6\"" Jan 15 01:20:33.883056 containerd[1715]: time="2026-01-15T01:20:33.882987721Z" level=info msg="connecting to shim 251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6" address="unix:///run/containerd/s/b33b4498050dda8cc0b5ddd166873b61b76fe206d972878c43a53da06ca4fa9d" protocol=ttrpc version=3 Jan 15 01:20:33.933226 systemd[1]: Started cri-containerd-251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6.scope - libcontainer container 251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6. 
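For scale, the calico/node pull recorded above read about 156,880,025 bytes in 8.009470895s, roughly 19.6 MB/s. A trivial check of that figure, using only the numbers printed in the log:

// pull_rate.go - back-of-the-envelope rate for the calico/node image pull above.
package main

import "fmt"

func main() {
	const bytesRead = 156880025 // "bytes read" reported when the pull stopped
	const seconds = 8.009470895 // duration from the "Pulled image" record
	rate := float64(bytesRead) / seconds
	fmt.Printf("%.1f MB/s (%.1f MiB/s)\n", rate/1e6, rate/(1<<20))
}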
Jan 15 01:20:33.989000 audit: BPF prog-id=172 op=LOAD Jan 15 01:20:33.991489 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 15 01:20:33.991540 kernel: audit: type=1334 audit(1768440033.989:574): prog-id=172 op=LOAD Jan 15 01:20:33.989000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.995409 kernel: audit: type=1300 audit(1768440033.989:574): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.989000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:34.000294 kernel: audit: type=1327 audit(1768440033.989:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:33.992000 audit: BPF prog-id=173 op=LOAD Jan 15 01:20:34.003781 kernel: audit: type=1334 audit(1768440033.992:575): prog-id=173 op=LOAD Jan 15 01:20:33.992000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:34.006540 kernel: audit: type=1300 audit(1768440033.992:575): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:34.011641 kernel: audit: type=1327 audit(1768440033.992:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:33.992000 audit: BPF prog-id=173 op=UNLOAD Jan 15 01:20:33.992000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:34.017835 kernel: audit: type=1334 audit(1768440033.992:576): prog-id=173 op=UNLOAD Jan 15 01:20:34.017884 kernel: audit: type=1300 
audit(1768440033.992:576): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:34.022856 kernel: audit: type=1327 audit(1768440033.992:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:33.992000 audit: BPF prog-id=172 op=UNLOAD Jan 15 01:20:34.025471 kernel: audit: type=1334 audit(1768440033.992:577): prog-id=172 op=UNLOAD Jan 15 01:20:33.992000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:33.992000 audit: BPF prog-id=174 op=LOAD Jan 15 01:20:33.992000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3813 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:33.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235316136333333343263393835366465626363346231376432346331 Jan 15 01:20:34.038655 containerd[1715]: time="2026-01-15T01:20:34.038625789Z" level=info msg="StartContainer for \"251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6\" returns successfully" Jan 15 01:20:34.125801 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 15 01:20:34.125920 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
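The audit records above carry the runc command line as a hex-encoded PROCTITLE field with NUL bytes separating the argv entries; the value here decodes to runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/251a63... (the container ID is truncated in the record itself). Below is a small decoder sketch for such values; decodeProctitle is a name chosen for this example, not part of any audit tooling.

// proctitle_decode.go - helper sketch for reading audit PROCTITLE fields like
// the ones logged above. Illustration only.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns the hex-encoded, NUL-separated argv of an audit
// PROCTITLE record back into a readable command line.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// argv entries are separated by NUL bytes in the kernel's proctitle dump.
	return strings.Join(strings.Split(string(raw), "\x00"), " "), nil
}

func main() {
	// Prefix of the PROCTITLE value recorded above; it decodes to "runc --root".
	const sample = "72756E63002D2D726F6F74"
	cmd, err := decodeProctitle(sample)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd)
}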
Jan 15 01:20:34.386592 kubelet[3275]: I0115 01:20:34.386318 3275 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdrgt\" (UniqueName: \"kubernetes.io/projected/bcaf6132-cf28-4c4b-82a5-f573197628f9-kube-api-access-tdrgt\") pod \"bcaf6132-cf28-4c4b-82a5-f573197628f9\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " Jan 15 01:20:34.387053 kubelet[3275]: I0115 01:20:34.387030 3275 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-backend-key-pair\") pod \"bcaf6132-cf28-4c4b-82a5-f573197628f9\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " Jan 15 01:20:34.387106 kubelet[3275]: I0115 01:20:34.387065 3275 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-ca-bundle\") pod \"bcaf6132-cf28-4c4b-82a5-f573197628f9\" (UID: \"bcaf6132-cf28-4c4b-82a5-f573197628f9\") " Jan 15 01:20:34.389115 kubelet[3275]: I0115 01:20:34.389086 3275 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bcaf6132-cf28-4c4b-82a5-f573197628f9" (UID: "bcaf6132-cf28-4c4b-82a5-f573197628f9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 15 01:20:34.394884 kubelet[3275]: I0115 01:20:34.394845 3275 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bcaf6132-cf28-4c4b-82a5-f573197628f9" (UID: "bcaf6132-cf28-4c4b-82a5-f573197628f9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 15 01:20:34.395079 kubelet[3275]: I0115 01:20:34.395056 3275 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcaf6132-cf28-4c4b-82a5-f573197628f9-kube-api-access-tdrgt" (OuterVolumeSpecName: "kube-api-access-tdrgt") pod "bcaf6132-cf28-4c4b-82a5-f573197628f9" (UID: "bcaf6132-cf28-4c4b-82a5-f573197628f9"). InnerVolumeSpecName "kube-api-access-tdrgt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 15 01:20:34.488165 kubelet[3275]: I0115 01:20:34.488119 3275 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-backend-key-pair\") on node \"ci-4515-1-0-n-d76f075714\" DevicePath \"\"" Jan 15 01:20:34.488165 kubelet[3275]: I0115 01:20:34.488153 3275 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcaf6132-cf28-4c4b-82a5-f573197628f9-whisker-ca-bundle\") on node \"ci-4515-1-0-n-d76f075714\" DevicePath \"\"" Jan 15 01:20:34.488165 kubelet[3275]: I0115 01:20:34.488163 3275 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdrgt\" (UniqueName: \"kubernetes.io/projected/bcaf6132-cf28-4c4b-82a5-f573197628f9-kube-api-access-tdrgt\") on node \"ci-4515-1-0-n-d76f075714\" DevicePath \"\"" Jan 15 01:20:34.769473 systemd[1]: var-lib-kubelet-pods-bcaf6132\x2dcf28\x2d4c4b\x2d82a5\x2df573197628f9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtdrgt.mount: Deactivated successfully. Jan 15 01:20:34.769582 systemd[1]: var-lib-kubelet-pods-bcaf6132\x2dcf28\x2d4c4b\x2d82a5\x2df573197628f9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 15 01:20:34.838004 systemd[1]: Removed slice kubepods-besteffort-podbcaf6132_cf28_4c4b_82a5_f573197628f9.slice - libcontainer container kubepods-besteffort-podbcaf6132_cf28_4c4b_82a5_f573197628f9.slice. Jan 15 01:20:34.851263 kubelet[3275]: I0115 01:20:34.851215 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jjvh5" podStartSLOduration=2.334508268 podStartE2EDuration="23.851198961s" podCreationTimestamp="2026-01-15 01:20:11 +0000 UTC" firstStartedPulling="2026-01-15 01:20:12.302034758 +0000 UTC m=+18.725475379" lastFinishedPulling="2026-01-15 01:20:33.818725452 +0000 UTC m=+40.242166072" observedRunningTime="2026-01-15 01:20:34.849875018 +0000 UTC m=+41.273315662" watchObservedRunningTime="2026-01-15 01:20:34.851198961 +0000 UTC m=+41.274639583" Jan 15 01:20:34.907268 systemd[1]: Created slice kubepods-besteffort-pod20583031_73a3_4ec4_aced_e96f0a2ba67b.slice - libcontainer container kubepods-besteffort-pod20583031_73a3_4ec4_aced_e96f0a2ba67b.slice. 
Jan 15 01:20:34.991653 kubelet[3275]: I0115 01:20:34.991604 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20583031-73a3-4ec4-aced-e96f0a2ba67b-whisker-ca-bundle\") pod \"whisker-7f47b45b95-r7r79\" (UID: \"20583031-73a3-4ec4-aced-e96f0a2ba67b\") " pod="calico-system/whisker-7f47b45b95-r7r79" Jan 15 01:20:34.991653 kubelet[3275]: I0115 01:20:34.991653 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9kn\" (UniqueName: \"kubernetes.io/projected/20583031-73a3-4ec4-aced-e96f0a2ba67b-kube-api-access-2z9kn\") pod \"whisker-7f47b45b95-r7r79\" (UID: \"20583031-73a3-4ec4-aced-e96f0a2ba67b\") " pod="calico-system/whisker-7f47b45b95-r7r79" Jan 15 01:20:34.991821 kubelet[3275]: I0115 01:20:34.991673 3275 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/20583031-73a3-4ec4-aced-e96f0a2ba67b-whisker-backend-key-pair\") pod \"whisker-7f47b45b95-r7r79\" (UID: \"20583031-73a3-4ec4-aced-e96f0a2ba67b\") " pod="calico-system/whisker-7f47b45b95-r7r79" Jan 15 01:20:35.210575 containerd[1715]: time="2026-01-15T01:20:35.210527009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f47b45b95-r7r79,Uid:20583031-73a3-4ec4-aced-e96f0a2ba67b,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:35.427628 systemd-networkd[1599]: cali9a34f9a0cff: Link UP Jan 15 01:20:35.428217 systemd-networkd[1599]: cali9a34f9a0cff: Gained carrier Jan 15 01:20:35.443429 containerd[1715]: 2026-01-15 01:20:35.249 [INFO][4339] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 01:20:35.443429 containerd[1715]: 2026-01-15 01:20:35.339 [INFO][4339] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0 whisker-7f47b45b95- calico-system 20583031-73a3-4ec4-aced-e96f0a2ba67b 868 0 2026-01-15 01:20:34 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7f47b45b95 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 whisker-7f47b45b95-r7r79 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9a34f9a0cff [] [] }} ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-" Jan 15 01:20:35.443429 containerd[1715]: 2026-01-15 01:20:35.339 [INFO][4339] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.443429 containerd[1715]: 2026-01-15 01:20:35.374 [INFO][4350] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" HandleID="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Workload="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.374 [INFO][4350] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" HandleID="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Workload="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"whisker-7f47b45b95-r7r79", "timestamp":"2026-01-15 01:20:35.374124065 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.374 [INFO][4350] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.374 [INFO][4350] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.374 [INFO][4350] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.380 [INFO][4350] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.385 [INFO][4350] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.390 [INFO][4350] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.392 [INFO][4350] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443632 containerd[1715]: 2026-01-15 01:20:35.394 [INFO][4350] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.394 [INFO][4350] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.396 [INFO][4350] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21 Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.402 [INFO][4350] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.409 [INFO][4350] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.1/26] block=192.168.72.0/26 handle="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.409 [INFO][4350] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.1/26] handle="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 
01:20:35.410 [INFO][4350] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 01:20:35.443818 containerd[1715]: 2026-01-15 01:20:35.410 [INFO][4350] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.1/26] IPv6=[] ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" HandleID="k8s-pod-network.c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Workload="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.443971 containerd[1715]: 2026-01-15 01:20:35.412 [INFO][4339] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0", GenerateName:"whisker-7f47b45b95-", Namespace:"calico-system", SelfLink:"", UID:"20583031-73a3-4ec4-aced-e96f0a2ba67b", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7f47b45b95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"whisker-7f47b45b95-r7r79", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9a34f9a0cff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:35.443971 containerd[1715]: 2026-01-15 01:20:35.412 [INFO][4339] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.1/32] ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.444149 containerd[1715]: 2026-01-15 01:20:35.412 [INFO][4339] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9a34f9a0cff ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.444149 containerd[1715]: 2026-01-15 01:20:35.429 [INFO][4339] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.444191 containerd[1715]: 2026-01-15 01:20:35.429 [INFO][4339] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0", GenerateName:"whisker-7f47b45b95-", Namespace:"calico-system", SelfLink:"", UID:"20583031-73a3-4ec4-aced-e96f0a2ba67b", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7f47b45b95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21", Pod:"whisker-7f47b45b95-r7r79", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9a34f9a0cff", MAC:"72:fb:f2:f2:ba:bc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:35.444248 containerd[1715]: 2026-01-15 01:20:35.439 [INFO][4339] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" Namespace="calico-system" Pod="whisker-7f47b45b95-r7r79" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-whisker--7f47b45b95--r7r79-eth0" Jan 15 01:20:35.527591 containerd[1715]: time="2026-01-15T01:20:35.527495849Z" level=info msg="connecting to shim c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21" address="unix:///run/containerd/s/3ab09e8ad3391cd04a76babdaa4b05c95beffb04c975ae7d18e1c69364ac4dfe" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:35.562466 systemd[1]: Started cri-containerd-c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21.scope - libcontainer container c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21. 
Jan 15 01:20:35.582000 audit: BPF prog-id=175 op=LOAD Jan 15 01:20:35.584000 audit: BPF prog-id=176 op=LOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=176 op=UNLOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=177 op=LOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=178 op=LOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=178 op=UNLOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=177 op=UNLOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.584000 audit: BPF prog-id=179 op=LOAD Jan 15 01:20:35.584000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:35.584000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6333326463663039306332313738303266343230643736333938323438 Jan 15 01:20:35.648461 containerd[1715]: time="2026-01-15T01:20:35.648328233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f47b45b95-r7r79,Uid:20583031-73a3-4ec4-aced-e96f0a2ba67b,Namespace:calico-system,Attempt:0,} returns sandbox id \"c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21\"" Jan 15 01:20:35.652315 containerd[1715]: time="2026-01-15T01:20:35.652288964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 01:20:35.693250 kubelet[3275]: I0115 01:20:35.693219 3275 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcaf6132-cf28-4c4b-82a5-f573197628f9" path="/var/lib/kubelet/pods/bcaf6132-cf28-4c4b-82a5-f573197628f9/volumes" Jan 15 01:20:35.988327 containerd[1715]: time="2026-01-15T01:20:35.988275742Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:35.989541 containerd[1715]: time="2026-01-15T01:20:35.989505409Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 01:20:35.989624 containerd[1715]: time="2026-01-15T01:20:35.989600405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:35.989792 kubelet[3275]: E0115 01:20:35.989755 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:20:35.989833 kubelet[3275]: E0115 01:20:35.989801 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:20:35.994623 kubelet[3275]: E0115 01:20:35.994573 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8086b71984894a589b5b3e200fb6432c,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:35.996740 containerd[1715]: time="2026-01-15T01:20:35.996714311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 01:20:36.114590 kubelet[3275]: I0115 01:20:36.114494 3275 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 01:20:36.143000 audit[4500]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=4500 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:36.143000 audit[4500]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcf721bc00 a2=0 a3=7ffcf721bbec items=0 ppid=3378 pid=4500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.143000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:36.147000 audit[4500]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=4500 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:36.147000 audit[4500]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffcf721bc00 a2=0 a3=7ffcf721bbec items=0 ppid=3378 pid=4500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.147000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:36.337705 containerd[1715]: time="2026-01-15T01:20:36.336692638Z" level=info msg="fetch 
failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:36.338560 containerd[1715]: time="2026-01-15T01:20:36.338509331Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 01:20:36.338612 containerd[1715]: time="2026-01-15T01:20:36.338602107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:36.338804 kubelet[3275]: E0115 01:20:36.338763 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:20:36.338867 kubelet[3275]: E0115 01:20:36.338805 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:20:36.339101 kubelet[3275]: E0115 01:20:36.339058 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:36.340276 kubelet[3275]: E0115 01:20:36.340214 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:20:36.691932 containerd[1715]: time="2026-01-15T01:20:36.691873969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-srjh9,Uid:ddb26c79-6272-4ee5-ba41-ad8ec552e6c6,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:36.736472 systemd-networkd[1599]: cali9a34f9a0cff: Gained IPv6LL Jan 15 01:20:36.831499 systemd-networkd[1599]: caliab882494aee: Link UP Jan 15 01:20:36.831720 systemd-networkd[1599]: caliab882494aee: Gained carrier Jan 15 01:20:36.842496 kubelet[3275]: E0115 01:20:36.842456 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:20:36.852125 containerd[1715]: 2026-01-15 01:20:36.721 [INFO][4504] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 01:20:36.852125 containerd[1715]: 2026-01-15 01:20:36.739 [INFO][4504] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0 csi-node-driver- calico-system ddb26c79-6272-4ee5-ba41-ad8ec552e6c6 685 0 2026-01-15 01:20:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 csi-node-driver-srjh9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliab882494aee [] [] }} ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-" Jan 15 01:20:36.852125 containerd[1715]: 2026-01-15 01:20:36.739 [INFO][4504] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.852125 containerd[1715]: 2026-01-15 01:20:36.776 [INFO][4517] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" HandleID="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Workload="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.776 [INFO][4517] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" HandleID="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Workload="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"csi-node-driver-srjh9", "timestamp":"2026-01-15 01:20:36.776652117 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.777 [INFO][4517] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.777 [INFO][4517] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.777 [INFO][4517] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.785 [INFO][4517] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.791 [INFO][4517] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.795 [INFO][4517] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.797 [INFO][4517] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853206 containerd[1715]: 2026-01-15 01:20:36.801 [INFO][4517] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.801 [INFO][4517] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.803 [INFO][4517] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.808 [INFO][4517] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 
handle="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.824 [INFO][4517] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.2/26] block=192.168.72.0/26 handle="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.824 [INFO][4517] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.2/26] handle="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.824 [INFO][4517] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 01:20:36.853427 containerd[1715]: 2026-01-15 01:20:36.824 [INFO][4517] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.2/26] IPv6=[] ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" HandleID="k8s-pod-network.a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Workload="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.853571 containerd[1715]: 2026-01-15 01:20:36.827 [INFO][4504] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"csi-node-driver-srjh9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab882494aee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:36.853628 containerd[1715]: 2026-01-15 01:20:36.828 [INFO][4504] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.2/32] ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.853628 containerd[1715]: 2026-01-15 01:20:36.828 [INFO][4504] cni-plugin/dataplane_linux.go 69: Setting 
the host side veth name to caliab882494aee ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.853628 containerd[1715]: 2026-01-15 01:20:36.830 [INFO][4504] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.853687 containerd[1715]: 2026-01-15 01:20:36.830 [INFO][4504] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ddb26c79-6272-4ee5-ba41-ad8ec552e6c6", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e", Pod:"csi-node-driver-srjh9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab882494aee", MAC:"5e:1c:f4:a3:65:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:36.853737 containerd[1715]: 2026-01-15 01:20:36.848 [INFO][4504] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" Namespace="calico-system" Pod="csi-node-driver-srjh9" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-csi--node--driver--srjh9-eth0" Jan 15 01:20:36.880117 containerd[1715]: time="2026-01-15T01:20:36.879987178Z" level=info msg="connecting to shim a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e" address="unix:///run/containerd/s/20737f3278846f6c00f35782131ae0ef3f63bd7cf996eb2fb2fbb30d16686acb" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:36.909000 audit[4580]: NETFILTER_CFG table=filter:119 family=2 entries=20 op=nft_register_rule pid=4580 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:36.913197 systemd[1]: Started 
cri-containerd-a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e.scope - libcontainer container a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e. Jan 15 01:20:36.909000 audit[4580]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff9bd0d740 a2=0 a3=7fff9bd0d72c items=0 ppid=3378 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.909000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:36.915000 audit[4580]: NETFILTER_CFG table=nat:120 family=2 entries=14 op=nft_register_rule pid=4580 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:36.915000 audit[4580]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7fff9bd0d740 a2=0 a3=0 items=0 ppid=3378 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.915000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:36.944000 audit: BPF prog-id=180 op=LOAD Jan 15 01:20:36.944000 audit: BPF prog-id=181 op=LOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=181 op=UNLOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=182 op=LOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=183 op=LOAD Jan 15 01:20:36.944000 
audit[4567]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=183 op=UNLOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=182 op=UNLOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:36.944000 audit: BPF prog-id=184 op=LOAD Jan 15 01:20:36.944000 audit[4567]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4557 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:36.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139326237306465373938633432323837636631623032396539646236 Jan 15 01:20:37.019358 containerd[1715]: time="2026-01-15T01:20:37.019317227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-srjh9,Uid:ddb26c79-6272-4ee5-ba41-ad8ec552e6c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e\"" Jan 15 01:20:37.021705 containerd[1715]: time="2026-01-15T01:20:37.021603064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 01:20:37.106000 audit: BPF prog-id=185 op=LOAD Jan 15 01:20:37.106000 audit[4632]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe318d4510 a2=98 a3=1fffffffffffffff items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.106000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.107000 audit: BPF prog-id=185 op=UNLOAD Jan 15 01:20:37.107000 audit[4632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe318d44e0 a3=0 items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.107000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.107000 audit: BPF prog-id=186 op=LOAD Jan 15 01:20:37.107000 audit[4632]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe318d43f0 a2=94 a3=3 items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.107000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.107000 audit: BPF prog-id=186 op=UNLOAD Jan 15 01:20:37.107000 audit[4632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe318d43f0 a2=94 a3=3 items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.107000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.107000 audit: BPF prog-id=187 op=LOAD Jan 15 01:20:37.107000 audit[4632]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe318d4430 a2=94 a3=7ffe318d4610 items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.107000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.107000 audit: BPF prog-id=187 op=UNLOAD Jan 15 01:20:37.107000 audit[4632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe318d4430 a2=94 a3=7ffe318d4610 items=0 ppid=4600 pid=4632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.107000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 01:20:37.109000 audit: BPF prog-id=188 op=LOAD Jan 15 01:20:37.109000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcfef59310 a2=98 a3=3 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.109000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.109000 audit: BPF prog-id=188 op=UNLOAD Jan 15 01:20:37.109000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcfef592e0 a3=0 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.109000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.109000 audit: BPF prog-id=189 op=LOAD Jan 15 01:20:37.109000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcfef59100 a2=94 a3=54428f items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.109000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.110000 audit: BPF prog-id=189 op=UNLOAD Jan 15 01:20:37.110000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcfef59100 a2=94 a3=54428f items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.110000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.110000 audit: BPF prog-id=190 op=LOAD Jan 15 01:20:37.110000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcfef59130 a2=94 a3=2 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.110000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.110000 audit: BPF prog-id=190 op=UNLOAD Jan 15 01:20:37.110000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcfef59130 a2=0 a3=2 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.110000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.280000 audit: BPF prog-id=191 op=LOAD Jan 15 01:20:37.280000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcfef58ff0 a2=94 a3=1 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 
01:20:37.280000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.280000 audit: BPF prog-id=191 op=UNLOAD Jan 15 01:20:37.280000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcfef58ff0 a2=94 a3=1 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.280000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.292000 audit: BPF prog-id=192 op=LOAD Jan 15 01:20:37.292000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcfef58fe0 a2=94 a3=4 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.292000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.292000 audit: BPF prog-id=192 op=UNLOAD Jan 15 01:20:37.292000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcfef58fe0 a2=0 a3=4 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.292000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=193 op=LOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcfef58e40 a2=94 a3=5 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=193 op=UNLOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcfef58e40 a2=0 a3=5 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=194 op=LOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcfef59060 a2=94 a3=6 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=194 op=UNLOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcfef59060 a2=0 a3=6 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=195 op=LOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcfef58810 a2=94 a3=88 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=196 op=LOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffcfef58690 a2=94 a3=2 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=196 op=UNLOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffcfef586c0 a2=0 a3=7ffcfef587c0 items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.293000 audit: BPF prog-id=195 op=UNLOAD Jan 15 01:20:37.293000 audit[4633]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=1cd65d10 a2=0 a3=fdcb16160a602acf items=0 ppid=4600 pid=4633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.293000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 01:20:37.301000 audit: BPF prog-id=197 op=LOAD Jan 15 01:20:37.301000 audit[4636]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffd72dfe60 a2=98 a3=1999999999999999 items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.301000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.301000 audit: BPF prog-id=197 op=UNLOAD Jan 15 01:20:37.301000 audit[4636]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffd72dfe30 a3=0 items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.301000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.301000 audit: BPF prog-id=198 op=LOAD Jan 15 01:20:37.301000 audit[4636]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffd72dfd40 a2=94 a3=ffff items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.301000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.301000 audit: BPF prog-id=198 op=UNLOAD Jan 15 01:20:37.301000 audit[4636]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffd72dfd40 a2=94 a3=ffff items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.301000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.301000 audit: BPF prog-id=199 op=LOAD Jan 15 01:20:37.301000 audit[4636]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffd72dfd80 a2=94 a3=7fffd72dff60 items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.301000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.302000 audit: BPF prog-id=199 op=UNLOAD Jan 15 01:20:37.302000 audit[4636]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffd72dfd80 a2=94 a3=7fffd72dff60 items=0 ppid=4600 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.302000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 01:20:37.370489 containerd[1715]: time="2026-01-15T01:20:37.370179675Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:37.379027 containerd[1715]: time="2026-01-15T01:20:37.378956536Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 01:20:37.379027 containerd[1715]: time="2026-01-15T01:20:37.378995396Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:37.379849 kubelet[3275]: E0115 01:20:37.379438 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:20:37.381947 kubelet[3275]: E0115 
01:20:37.380294 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:20:37.381947 kubelet[3275]: E0115 01:20:37.380423 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:37.385853 containerd[1715]: time="2026-01-15T01:20:37.383746515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 01:20:37.386113 systemd-networkd[1599]: vxlan.calico: Link UP Jan 15 01:20:37.386543 systemd-networkd[1599]: vxlan.calico: Gained carrier Jan 15 01:20:37.423000 audit: BPF prog-id=200 op=LOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcb2160ed0 a2=98 a3=0 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=200 op=UNLOAD Jan 
15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcb2160ea0 a3=0 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=201 op=LOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcb2160ce0 a2=94 a3=54428f items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=201 op=UNLOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcb2160ce0 a2=94 a3=54428f items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=202 op=LOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcb2160d10 a2=94 a3=2 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=202 op=UNLOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcb2160d10 a2=0 a3=2 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=203 op=LOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcb2160ac0 a2=94 a3=4 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 
01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=203 op=UNLOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcb2160ac0 a2=94 a3=4 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=204 op=LOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcb2160bc0 a2=94 a3=7ffcb2160d40 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.423000 audit: BPF prog-id=204 op=UNLOAD Jan 15 01:20:37.423000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcb2160bc0 a2=0 a3=7ffcb2160d40 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.423000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.427000 audit: BPF prog-id=205 op=LOAD Jan 15 01:20:37.427000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcb21602f0 a2=94 a3=2 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.427000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.427000 audit: BPF prog-id=205 op=UNLOAD Jan 15 01:20:37.427000 audit[4664]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcb21602f0 a2=0 a3=2 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.427000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.427000 audit: BPF prog-id=206 op=LOAD Jan 15 
01:20:37.427000 audit[4664]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcb21603f0 a2=94 a3=30 items=0 ppid=4600 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.427000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 01:20:37.436000 audit: BPF prog-id=207 op=LOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd29bff8b0 a2=98 a3=0 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.436000 audit: BPF prog-id=207 op=UNLOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd29bff880 a3=0 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.436000 audit: BPF prog-id=208 op=LOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd29bff6a0 a2=94 a3=54428f items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.436000 audit: BPF prog-id=208 op=UNLOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd29bff6a0 a2=94 a3=54428f items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.436000 audit: BPF prog-id=209 op=LOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd29bff6d0 a2=94 a3=2 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.436000 audit: BPF prog-id=209 op=UNLOAD Jan 15 01:20:37.436000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd29bff6d0 a2=0 a3=2 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.436000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.494969 kubelet[3275]: I0115 01:20:37.494930 3275 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 01:20:37.644000 audit: BPF prog-id=210 op=LOAD Jan 15 01:20:37.644000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd29bff590 a2=94 a3=1 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.644000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.644000 audit: BPF prog-id=210 op=UNLOAD Jan 15 01:20:37.644000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd29bff590 a2=94 a3=1 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.644000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.656000 audit: BPF prog-id=211 op=LOAD Jan 15 01:20:37.656000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd29bff580 a2=94 a3=4 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.656000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.657000 audit: BPF prog-id=211 op=UNLOAD Jan 15 01:20:37.657000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd29bff580 a2=0 a3=4 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.657000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.657000 audit: BPF prog-id=212 op=LOAD Jan 15 01:20:37.657000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd29bff3e0 
a2=94 a3=5 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.657000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF prog-id=212 op=UNLOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd29bff3e0 a2=0 a3=5 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF prog-id=213 op=LOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd29bff600 a2=94 a3=6 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF prog-id=213 op=UNLOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd29bff600 a2=0 a3=6 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF prog-id=214 op=LOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd29bfedb0 a2=94 a3=88 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF prog-id=215 op=LOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd29bfec30 a2=94 a3=2 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.658000 audit: BPF 
prog-id=215 op=UNLOAD Jan 15 01:20:37.658000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd29bfec60 a2=0 a3=7ffd29bfed60 items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.658000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.660000 audit: BPF prog-id=214 op=UNLOAD Jan 15 01:20:37.660000 audit[4667]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=34072d10 a2=0 a3=dbb6236a411415c items=0 ppid=4600 pid=4667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 01:20:37.668000 audit: BPF prog-id=206 op=UNLOAD Jan 15 01:20:37.668000 audit[4600]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c0008ea200 a2=0 a3=0 items=0 ppid=4367 pid=4600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:37.668000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 15 01:20:37.691965 containerd[1715]: time="2026-01-15T01:20:37.691758236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9c45d7c9c-l7rhq,Uid:0d14115d-26fb-4eac-a6b9-b5aa96406bb8,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:37.692685 containerd[1715]: time="2026-01-15T01:20:37.692452576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-grnmf,Uid:fc1c990f-1003-460d-a72d-34a2a5fb4d83,Namespace:calico-apiserver,Attempt:0,}" Jan 15 01:20:37.692685 containerd[1715]: time="2026-01-15T01:20:37.692510780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-9rs84,Uid:6ebcb69d-4b71-4631-9554-a5f179cc05ba,Namespace:calico-apiserver,Attempt:0,}" Jan 15 01:20:37.757907 containerd[1715]: time="2026-01-15T01:20:37.757828273Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:38.172720 containerd[1715]: time="2026-01-15T01:20:38.172633418Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 01:20:38.173463 containerd[1715]: time="2026-01-15T01:20:38.172931652Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:38.173511 kubelet[3275]: E0115 01:20:38.173066 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:20:38.173511 kubelet[3275]: E0115 01:20:38.173104 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:20:38.173511 kubelet[3275]: E0115 01:20:38.173204 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:38.174498 kubelet[3275]: E0115 01:20:38.174467 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 
01:20:38.230000 audit[4767]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4767 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.230000 audit[4767]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7fff0fe95690 a2=0 a3=7fff0fe9567c items=0 ppid=4600 pid=4767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.230000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.260000 audit[4765]: NETFILTER_CFG table=raw:122 family=2 entries=21 op=nft_register_chain pid=4765 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.260000 audit[4765]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffe3badf900 a2=0 a3=7ffe3badf8ec items=0 ppid=4600 pid=4765 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.260000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.264000 audit[4771]: NETFILTER_CFG table=mangle:123 family=2 entries=16 op=nft_register_chain pid=4771 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.264000 audit[4771]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffe581d1460 a2=0 a3=7ffe581d144c items=0 ppid=4600 pid=4771 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.264000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.280000 audit[4778]: NETFILTER_CFG table=filter:124 family=2 entries=122 op=nft_register_chain pid=4778 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.280000 audit[4778]: SYSCALL arch=c000003e syscall=46 success=yes exit=69792 a0=3 a1=7fffe097a6f0 a2=0 a3=7fffe097a6dc items=0 ppid=4600 pid=4778 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.280000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.363885 systemd-networkd[1599]: cali978989ccb0f: Link UP Jan 15 01:20:38.364422 systemd-networkd[1599]: cali978989ccb0f: Gained carrier Jan 15 01:20:38.384287 containerd[1715]: 2026-01-15 01:20:38.264 [INFO][4740] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0 calico-kube-controllers-9c45d7c9c- calico-system 0d14115d-26fb-4eac-a6b9-b5aa96406bb8 799 0 2026-01-15 01:20:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers 
pod-template-hash:9c45d7c9c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 calico-kube-controllers-9c45d7c9c-l7rhq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali978989ccb0f [] [] }} ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-" Jan 15 01:20:38.384287 containerd[1715]: 2026-01-15 01:20:38.264 [INFO][4740] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.384287 containerd[1715]: 2026-01-15 01:20:38.307 [INFO][4786] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" HandleID="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.307 [INFO][4786] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" HandleID="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5800), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"calico-kube-controllers-9c45d7c9c-l7rhq", "timestamp":"2026-01-15 01:20:38.30706491 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.307 [INFO][4786] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.307 [INFO][4786] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.307 [INFO][4786] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.316 [INFO][4786] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.332 [INFO][4786] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.340 [INFO][4786] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.342 [INFO][4786] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.385024 containerd[1715]: 2026-01-15 01:20:38.344 [INFO][4786] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.344 [INFO][4786] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.345 [INFO][4786] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.351 [INFO][4786] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.357 [INFO][4786] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.3/26] block=192.168.72.0/26 handle="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.357 [INFO][4786] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.3/26] handle="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.357 [INFO][4786] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
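The IPAM trace above confirms the node's block affinity for 192.168.72.0/26 and then claims 192.168.72.3 from that block for the calico-kube-controllers pod. The block arithmetic can be checked with the standard library alone (values taken from the log; a sketch, not part of the Calico code path):

    import ipaddress

    # Node-affine Calico IPAM block and the address claimed from it, per the log above.
    block = ipaddress.ip_network("192.168.72.0/26")
    claimed = ipaddress.ip_address("192.168.72.3")

    print(block.num_addresses)       # 64 addresses in a /26 block
    print(claimed in block)          # True: the claimed IP lies inside the affine block
    print(block.broadcast_address)   # 192.168.72.63, the last address the block covers

The same /26 affinity is why the other pods scheduled to this node below receive 192.168.72.4, .5 and .6 from the same block.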
Jan 15 01:20:38.386203 containerd[1715]: 2026-01-15 01:20:38.357 [INFO][4786] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.3/26] IPv6=[] ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" HandleID="k8s-pod-network.9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.386376 containerd[1715]: 2026-01-15 01:20:38.361 [INFO][4740] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0", GenerateName:"calico-kube-controllers-9c45d7c9c-", Namespace:"calico-system", SelfLink:"", UID:"0d14115d-26fb-4eac-a6b9-b5aa96406bb8", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9c45d7c9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"calico-kube-controllers-9c45d7c9c-l7rhq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali978989ccb0f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.386454 containerd[1715]: 2026-01-15 01:20:38.361 [INFO][4740] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.3/32] ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.386454 containerd[1715]: 2026-01-15 01:20:38.361 [INFO][4740] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali978989ccb0f ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.386454 containerd[1715]: 2026-01-15 01:20:38.364 [INFO][4740] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" 
WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.386532 containerd[1715]: 2026-01-15 01:20:38.365 [INFO][4740] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0", GenerateName:"calico-kube-controllers-9c45d7c9c-", Namespace:"calico-system", SelfLink:"", UID:"0d14115d-26fb-4eac-a6b9-b5aa96406bb8", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9c45d7c9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe", Pod:"calico-kube-controllers-9c45d7c9c-l7rhq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali978989ccb0f", MAC:"22:5b:1a:72:52:bc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.386595 containerd[1715]: 2026-01-15 01:20:38.379 [INFO][4740] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" Namespace="calico-system" Pod="calico-kube-controllers-9c45d7c9c-l7rhq" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--kube--controllers--9c45d7c9c--l7rhq-eth0" Jan 15 01:20:38.393000 audit[4817]: NETFILTER_CFG table=filter:125 family=2 entries=46 op=nft_register_chain pid=4817 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.393000 audit[4817]: SYSCALL arch=c000003e syscall=46 success=yes exit=23616 a0=3 a1=7ffe76302420 a2=0 a3=7ffe7630240c items=0 ppid=4600 pid=4817 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.393000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.408263 containerd[1715]: time="2026-01-15T01:20:38.408193339Z" level=info msg="connecting to shim 9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe" address="unix:///run/containerd/s/b1e8de68a49310b3ed57824cc280f9fb48067a6eb8c7606ed01f63b7e3185125" 
namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:38.438482 systemd[1]: Started cri-containerd-9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe.scope - libcontainer container 9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe. Jan 15 01:20:38.459000 audit: BPF prog-id=216 op=LOAD Jan 15 01:20:38.461000 audit: BPF prog-id=217 op=LOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=217 op=UNLOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=218 op=LOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=219 op=LOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=219 op=UNLOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=218 op=UNLOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.461000 audit: BPF prog-id=220 op=LOAD Jan 15 01:20:38.461000 audit[4838]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4827 pid=4838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964663230636337613564643738386239323730323161376630386364 Jan 15 01:20:38.478077 systemd-networkd[1599]: calia1efb9fd50f: Link UP Jan 15 01:20:38.480923 systemd-networkd[1599]: calia1efb9fd50f: Gained carrier Jan 15 01:20:38.503340 containerd[1715]: 2026-01-15 01:20:38.272 [INFO][4737] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0 calico-apiserver-5b8cdf5dcc- calico-apiserver 6ebcb69d-4b71-4631-9554-a5f179cc05ba 798 0 2026-01-15 01:20:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5b8cdf5dcc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 calico-apiserver-5b8cdf5dcc-9rs84 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia1efb9fd50f [] [] }} ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-" Jan 15 01:20:38.503340 containerd[1715]: 2026-01-15 01:20:38.273 [INFO][4737] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503340 containerd[1715]: 2026-01-15 01:20:38.332 [INFO][4791] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" HandleID="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" 
Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.332 [INFO][4791] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" HandleID="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cbad0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-d76f075714", "pod":"calico-apiserver-5b8cdf5dcc-9rs84", "timestamp":"2026-01-15 01:20:38.332361726 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.332 [INFO][4791] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.357 [INFO][4791] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.358 [INFO][4791] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.419 [INFO][4791] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.431 [INFO][4791] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.442 [INFO][4791] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.446 [INFO][4791] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503544 containerd[1715]: 2026-01-15 01:20:38.449 [INFO][4791] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.449 [INFO][4791] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.454 [INFO][4791] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6 Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.459 [INFO][4791] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4791] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.4/26] block=192.168.72.0/26 handle="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4791] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: 
[192.168.72.4/26] handle="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4791] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 01:20:38.503732 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4791] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.4/26] IPv6=[] ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" HandleID="k8s-pod-network.d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503860 containerd[1715]: 2026-01-15 01:20:38.473 [INFO][4737] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0", GenerateName:"calico-apiserver-5b8cdf5dcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ebcb69d-4b71-4631-9554-a5f179cc05ba", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b8cdf5dcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"calico-apiserver-5b8cdf5dcc-9rs84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia1efb9fd50f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.503910 containerd[1715]: 2026-01-15 01:20:38.473 [INFO][4737] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.4/32] ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503910 containerd[1715]: 2026-01-15 01:20:38.474 [INFO][4737] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia1efb9fd50f ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503910 containerd[1715]: 2026-01-15 01:20:38.484 [INFO][4737] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.503972 containerd[1715]: 2026-01-15 01:20:38.488 [INFO][4737] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0", GenerateName:"calico-apiserver-5b8cdf5dcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ebcb69d-4b71-4631-9554-a5f179cc05ba", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b8cdf5dcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6", Pod:"calico-apiserver-5b8cdf5dcc-9rs84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia1efb9fd50f", MAC:"2e:8d:f5:6d:75:78", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.505083 containerd[1715]: 2026-01-15 01:20:38.499 [INFO][4737] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-9rs84" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--9rs84-eth0" Jan 15 01:20:38.534805 containerd[1715]: time="2026-01-15T01:20:38.534746748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9c45d7c9c-l7rhq,Uid:0d14115d-26fb-4eac-a6b9-b5aa96406bb8,Namespace:calico-system,Attempt:0,} returns sandbox id \"9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe\"" Jan 15 01:20:38.537564 containerd[1715]: time="2026-01-15T01:20:38.537457935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 01:20:38.550478 containerd[1715]: time="2026-01-15T01:20:38.550436139Z" level=info msg="connecting to shim d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6" address="unix:///run/containerd/s/f39e93d5af5bf3fe7ba91699a742bd4202694b8cb65989e234df2e1c59471c88" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:38.583997 systemd-networkd[1599]: calidfac3594a5b: Link UP Jan 15 01:20:38.588201 
systemd[1]: Started cri-containerd-d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6.scope - libcontainer container d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6. Jan 15 01:20:38.594597 systemd-networkd[1599]: calidfac3594a5b: Gained carrier Jan 15 01:20:38.613392 containerd[1715]: 2026-01-15 01:20:38.281 [INFO][4760] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0 calico-apiserver-5b8cdf5dcc- calico-apiserver fc1c990f-1003-460d-a72d-34a2a5fb4d83 795 0 2026-01-15 01:20:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5b8cdf5dcc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 calico-apiserver-5b8cdf5dcc-grnmf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidfac3594a5b [] [] }} ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-" Jan 15 01:20:38.613392 containerd[1715]: 2026-01-15 01:20:38.281 [INFO][4760] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.613392 containerd[1715]: 2026-01-15 01:20:38.337 [INFO][4797] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" HandleID="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.337 [INFO][4797] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" HandleID="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000373aa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-d76f075714", "pod":"calico-apiserver-5b8cdf5dcc-grnmf", "timestamp":"2026-01-15 01:20:38.337522914 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.337 [INFO][4797] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4797] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
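The audit records emitted while the cri-containerd scopes start show runc issuing a series of bpf(2) calls: on x86_64 (arch=c000003e) syscall 321 is bpf and syscall 3 is close, so each "BPF prog-id=N op=LOAD" line sits next to the bpf() call whose exit= value is the new program fd, and the matching "op=UNLOAD" is recorded alongside the close of that fd. SYSCALL register arguments are logged in hex while exit= is decimal, so a0=15 and a0=17 in the close calls are fds 21 and 23, exactly the fds returned by the preceding loads. The PROCTITLE on these records is the runc command line (runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/...), hex-encoded and truncated, so only the first characters of the container ID survive in it; the full 64-character ID appears in the surrounding containerd and systemd lines. A quick check of the hex/decimal correspondence, using only values printed above:

    # audit SYSCALL registers are hex, exit= is decimal:
    # close(a0=15) and close(a0=17) act on fds 21 and 23,
    # the fds returned by the earlier bpf() loads (exit=21, exit=23).
    print(int("15", 16), int("17", 16))   # 21 23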
Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.468 [INFO][4797] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.521 [INFO][4797] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.530 [INFO][4797] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.543 [INFO][4797] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.545 [INFO][4797] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.613581 containerd[1715]: 2026-01-15 01:20:38.549 [INFO][4797] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.551 [INFO][4797] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.552 [INFO][4797] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372 Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.561 [INFO][4797] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.571 [INFO][4797] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.5/26] block=192.168.72.0/26 handle="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.572 [INFO][4797] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.5/26] handle="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.572 [INFO][4797] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
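Three CNI ADD handlers run concurrently here, and the host-wide IPAM lock serializes them: handler [4786] holds the lock from .307 to .357, [4791] asks at .332 but only acquires at .357, and [4797] asks at .337 and acquires at .468, immediately after [4791] releases. The wait times fall straight out of the timestamps above (seconds within 01:20:38); a small sketch of that arithmetic:

    # IPAM lock wait per CNI ADD handler, from the log timestamps (requested, acquired).
    waits = {"4791": (0.332, 0.357), "4797": (0.337, 0.468)}
    for handler, (asked, got) in waits.items():
        print(handler, round((got - asked) * 1000), "ms")   # 4791: 25 ms, 4797: 131 ms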
Jan 15 01:20:38.614151 containerd[1715]: 2026-01-15 01:20:38.572 [INFO][4797] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.5/26] IPv6=[] ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" HandleID="k8s-pod-network.eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Workload="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.614470 containerd[1715]: 2026-01-15 01:20:38.575 [INFO][4760] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0", GenerateName:"calico-apiserver-5b8cdf5dcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc1c990f-1003-460d-a72d-34a2a5fb4d83", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b8cdf5dcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"calico-apiserver-5b8cdf5dcc-grnmf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidfac3594a5b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.614712 containerd[1715]: 2026-01-15 01:20:38.577 [INFO][4760] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.5/32] ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.614712 containerd[1715]: 2026-01-15 01:20:38.578 [INFO][4760] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidfac3594a5b ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.614712 containerd[1715]: 2026-01-15 01:20:38.599 [INFO][4760] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.614792 containerd[1715]: 2026-01-15 01:20:38.601 
[INFO][4760] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0", GenerateName:"calico-apiserver-5b8cdf5dcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc1c990f-1003-460d-a72d-34a2a5fb4d83", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b8cdf5dcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372", Pod:"calico-apiserver-5b8cdf5dcc-grnmf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidfac3594a5b", MAC:"1e:59:61:57:0e:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.614847 containerd[1715]: 2026-01-15 01:20:38.609 [INFO][4760] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" Namespace="calico-apiserver" Pod="calico-apiserver-5b8cdf5dcc-grnmf" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-calico--apiserver--5b8cdf5dcc--grnmf-eth0" Jan 15 01:20:38.630000 audit[4917]: NETFILTER_CFG table=filter:126 family=2 entries=60 op=nft_register_chain pid=4917 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.630000 audit[4917]: SYSCALL arch=c000003e syscall=46 success=yes exit=32232 a0=3 a1=7fff3fa8df90 a2=0 a3=7fff3fa8df7c items=0 ppid=4600 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.630000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.637000 audit: BPF prog-id=221 op=LOAD Jan 15 01:20:38.637000 audit: BPF prog-id=222 op=LOAD Jan 15 01:20:38.637000 audit[4890]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.637000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.637000 audit: BPF prog-id=222 op=UNLOAD Jan 15 01:20:38.637000 audit[4890]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.637000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.638000 audit: BPF prog-id=223 op=LOAD Jan 15 01:20:38.638000 audit[4890]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.638000 audit: BPF prog-id=224 op=LOAD Jan 15 01:20:38.638000 audit[4890]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.638000 audit: BPF prog-id=224 op=UNLOAD Jan 15 01:20:38.638000 audit[4890]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.638000 audit: BPF prog-id=223 op=UNLOAD Jan 15 01:20:38.638000 audit[4890]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.638000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.638000 audit: BPF prog-id=225 op=LOAD Jan 15 01:20:38.638000 audit[4890]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4879 pid=4890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6433636637656363653866623533653132663134386431653837373163 Jan 15 01:20:38.646364 containerd[1715]: time="2026-01-15T01:20:38.646192648Z" level=info msg="connecting to shim eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372" address="unix:///run/containerd/s/a8913f36a9ad2840b04267a7bcd9bcfb9b4a9dbef747b9b36c6c32f1ff24ad6d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:38.674217 systemd[1]: Started cri-containerd-eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372.scope - libcontainer container eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372. Jan 15 01:20:38.693216 containerd[1715]: time="2026-01-15T01:20:38.693051632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2sh5x,Uid:1740dd66-3e8a-4fbd-88e8-778f39eb7186,Namespace:kube-system,Attempt:0,}" Jan 15 01:20:38.696421 containerd[1715]: time="2026-01-15T01:20:38.696393165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-9rs84,Uid:6ebcb69d-4b71-4631-9554-a5f179cc05ba,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6\"" Jan 15 01:20:38.705000 audit[4966]: NETFILTER_CFG table=filter:127 family=2 entries=41 op=nft_register_chain pid=4966 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.705000 audit[4966]: SYSCALL arch=c000003e syscall=46 success=yes exit=23044 a0=3 a1=7ffefa638080 a2=0 a3=7ffefa63806c items=0 ppid=4600 pid=4966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.705000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.707000 audit: BPF prog-id=226 op=LOAD Jan 15 01:20:38.707000 audit: BPF prog-id=227 op=LOAD Jan 15 01:20:38.707000 audit[4938]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.707000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 
01:20:38.707000 audit: BPF prog-id=227 op=UNLOAD Jan 15 01:20:38.707000 audit[4938]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.707000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.708000 audit: BPF prog-id=228 op=LOAD Jan 15 01:20:38.708000 audit[4938]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.708000 audit: BPF prog-id=229 op=LOAD Jan 15 01:20:38.708000 audit[4938]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.708000 audit: BPF prog-id=229 op=UNLOAD Jan 15 01:20:38.708000 audit[4938]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.708000 audit: BPF prog-id=228 op=UNLOAD Jan 15 01:20:38.708000 audit[4938]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.708000 audit: BPF prog-id=230 op=LOAD Jan 15 01:20:38.708000 audit[4938]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 
a3=0 items=0 ppid=4927 pid=4938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562313664356264323765616139653733303565373762353964363664 Jan 15 01:20:38.769344 containerd[1715]: time="2026-01-15T01:20:38.769193631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b8cdf5dcc-grnmf,Uid:fc1c990f-1003-460d-a72d-34a2a5fb4d83,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372\"" Jan 15 01:20:38.832182 systemd-networkd[1599]: cali3dde3515453: Link UP Jan 15 01:20:38.832925 systemd-networkd[1599]: cali3dde3515453: Gained carrier Jan 15 01:20:38.846872 containerd[1715]: 2026-01-15 01:20:38.748 [INFO][4967] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0 coredns-668d6bf9bc- kube-system 1740dd66-3e8a-4fbd-88e8-778f39eb7186 797 0 2026-01-15 01:19:58 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 coredns-668d6bf9bc-2sh5x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3dde3515453 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-" Jan 15 01:20:38.846872 containerd[1715]: 2026-01-15 01:20:38.750 [INFO][4967] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.846872 containerd[1715]: 2026-01-15 01:20:38.785 [INFO][4985] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" HandleID="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.786 [INFO][4985] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" HandleID="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f220), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"coredns-668d6bf9bc-2sh5x", "timestamp":"2026-01-15 01:20:38.785947268 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.786 [INFO][4985] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.786 [INFO][4985] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.786 [INFO][4985] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.792 [INFO][4985] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.797 [INFO][4985] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.802 [INFO][4985] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.805 [INFO][4985] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847257 containerd[1715]: 2026-01-15 01:20:38.808 [INFO][4985] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.808 [INFO][4985] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.811 [INFO][4985] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.818 [INFO][4985] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.827 [INFO][4985] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.6/26] block=192.168.72.0/26 handle="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.827 [INFO][4985] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.6/26] handle="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.827 [INFO][4985] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
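The coredns pod gets 192.168.72.6, the fourth address claimed from the 192.168.72.0/26 block in this window, and its endpoint carries named ports (dns 53/UDP, dns-tcp 53/TCP, metrics 9153/TCP). In the WorkloadEndpoint dump that follows, the same ports appear as hex field values; converting them back is a one-liner (a sketch, nothing Calico-specific):

    # Port fields in the endpoint dump are printed in hex: 0x35 and 0x23c1.
    print(0x35, 0x23c1)   # 53 9153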
Jan 15 01:20:38.847458 containerd[1715]: 2026-01-15 01:20:38.827 [INFO][4985] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.6/26] IPv6=[] ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" HandleID="k8s-pod-network.2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.829 [INFO][4967] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1740dd66-3e8a-4fbd-88e8-778f39eb7186", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"coredns-668d6bf9bc-2sh5x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3dde3515453", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.829 [INFO][4967] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.6/32] ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.829 [INFO][4967] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3dde3515453 ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.833 [INFO][4967] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.833 [INFO][4967] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1740dd66-3e8a-4fbd-88e8-778f39eb7186", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce", Pod:"coredns-668d6bf9bc-2sh5x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3dde3515453", MAC:"8a:ca:30:4a:e0:2f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:38.847593 containerd[1715]: 2026-01-15 01:20:38.842 [INFO][4967] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" Namespace="kube-system" Pod="coredns-668d6bf9bc-2sh5x" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--2sh5x-eth0" Jan 15 01:20:38.854888 kubelet[3275]: E0115 01:20:38.853303 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:38.874328 containerd[1715]: time="2026-01-15T01:20:38.874251929Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:38.875975 containerd[1715]: time="2026-01-15T01:20:38.875940633Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 01:20:38.876211 containerd[1715]: time="2026-01-15T01:20:38.876196478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:38.876545 kubelet[3275]: E0115 01:20:38.876397 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:20:38.876545 kubelet[3275]: E0115 01:20:38.876446 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:20:38.876879 kubelet[3275]: E0115 01:20:38.876737 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:38.877862 containerd[1715]: time="2026-01-15T01:20:38.877423797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:20:38.878024 kubelet[3275]: E0115 01:20:38.877974 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:20:38.878938 containerd[1715]: time="2026-01-15T01:20:38.878900837Z" level=info msg="connecting to shim 2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce" address="unix:///run/containerd/s/70ddd3dd7dc59a51a06a1bcbdc2cd5a159adc79d28b9392808078e00ed9299da" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:38.907201 systemd-networkd[1599]: caliab882494aee: Gained IPv6LL Jan 15 01:20:38.910000 audit[5034]: NETFILTER_CFG table=filter:128 family=2 entries=50 op=nft_register_chain pid=5034 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:38.911305 systemd[1]: Started cri-containerd-2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce.scope - libcontainer container 2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce. 
Jan 15 01:20:38.910000 audit[5034]: SYSCALL arch=c000003e syscall=46 success=yes exit=24896 a0=3 a1=7fff8648f160 a2=0 a3=7fff8648f14c items=0 ppid=4600 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.910000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:38.927000 audit: BPF prog-id=231 op=LOAD Jan 15 01:20:38.927000 audit: BPF prog-id=232 op=LOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=232 op=UNLOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=233 op=LOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=234 op=LOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=234 op=UNLOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=233 op=UNLOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.927000 audit: BPF prog-id=235 op=LOAD Jan 15 01:20:38.927000 audit[5019]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:38.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238353830393564626666353835333633336264633433666664306366 Jan 15 01:20:38.964904 containerd[1715]: time="2026-01-15T01:20:38.964734742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2sh5x,Uid:1740dd66-3e8a-4fbd-88e8-778f39eb7186,Namespace:kube-system,Attempt:0,} returns sandbox id \"2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce\"" Jan 15 01:20:38.968404 containerd[1715]: time="2026-01-15T01:20:38.968369188Z" level=info msg="CreateContainer within sandbox \"2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 01:20:38.979032 containerd[1715]: time="2026-01-15T01:20:38.978928924Z" level=info msg="Container a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:38.988385 containerd[1715]: time="2026-01-15T01:20:38.988355752Z" level=info msg="CreateContainer within sandbox \"2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2\"" Jan 15 01:20:38.989882 containerd[1715]: time="2026-01-15T01:20:38.989816193Z" level=info msg="StartContainer for \"a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2\"" Jan 15 01:20:38.990744 containerd[1715]: time="2026-01-15T01:20:38.990696135Z" level=info msg="connecting to shim a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2" address="unix:///run/containerd/s/70ddd3dd7dc59a51a06a1bcbdc2cd5a159adc79d28b9392808078e00ed9299da" protocol=ttrpc version=3 Jan 15 01:20:39.016275 systemd[1]: Started 
cri-containerd-a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2.scope - libcontainer container a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2. Jan 15 01:20:39.031080 kernel: kauditd_printk_skb: 359 callbacks suppressed Jan 15 01:20:39.031175 kernel: audit: type=1334 audit(1768440039.028:701): prog-id=236 op=LOAD Jan 15 01:20:39.028000 audit: BPF prog-id=236 op=LOAD Jan 15 01:20:39.029000 audit: BPF prog-id=237 op=LOAD Jan 15 01:20:39.033079 kernel: audit: type=1334 audit(1768440039.029:702): prog-id=237 op=LOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.037087 kernel: audit: type=1300 audit(1768440039.029:702): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.041969 kernel: audit: type=1327 audit(1768440039.029:702): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.042034 kernel: audit: type=1334 audit(1768440039.029:703): prog-id=237 op=UNLOAD Jan 15 01:20:39.029000 audit: BPF prog-id=237 op=UNLOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.049249 kernel: audit: type=1300 audit(1768440039.029:703): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.049333 kernel: audit: type=1327 audit(1768440039.029:703): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.050823 kernel: audit: type=1334 audit(1768440039.029:704): prog-id=238 op=LOAD Jan 15 01:20:39.029000 audit: BPF prog-id=238 op=LOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=321 
success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.055034 kernel: audit: type=1300 audit(1768440039.029:704): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: BPF prog-id=239 op=LOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: BPF prog-id=239 op=UNLOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.059081 kernel: audit: type=1327 audit(1768440039.029:704): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: BPF prog-id=238 op=UNLOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.029000 audit: BPF prog-id=240 op=LOAD Jan 15 01:20:39.029000 audit[5050]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=5008 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6130366139353230646135333061663530313466666538316261613365 Jan 15 01:20:39.069579 containerd[1715]: time="2026-01-15T01:20:39.069545097Z" level=info msg="StartContainer for \"a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2\" returns successfully" Jan 15 01:20:39.221100 containerd[1715]: time="2026-01-15T01:20:39.220750839Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:39.222318 containerd[1715]: time="2026-01-15T01:20:39.222274839Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:20:39.222420 containerd[1715]: time="2026-01-15T01:20:39.222275425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:39.222542 kubelet[3275]: E0115 01:20:39.222480 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:39.223087 kubelet[3275]: E0115 01:20:39.222555 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:39.223087 kubelet[3275]: E0115 01:20:39.222782 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcc99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:39.224197 containerd[1715]: time="2026-01-15T01:20:39.222936763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:20:39.224292 kubelet[3275]: E0115 01:20:39.224216 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:20:39.291323 systemd-networkd[1599]: vxlan.calico: Gained IPv6LL Jan 15 01:20:39.561371 containerd[1715]: time="2026-01-15T01:20:39.561197431Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:39.562997 containerd[1715]: time="2026-01-15T01:20:39.562931802Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:20:39.563133 containerd[1715]: time="2026-01-15T01:20:39.563054171Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:39.563271 kubelet[3275]: E0115 01:20:39.563236 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:39.563382 kubelet[3275]: E0115 01:20:39.563289 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:39.563510 kubelet[3275]: E0115 01:20:39.563437 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:39.564690 kubelet[3275]: E0115 01:20:39.564633 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:20:39.693630 containerd[1715]: time="2026-01-15T01:20:39.693369042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qxn98,Uid:19b9076e-57b5-41dc-b303-63eafd79e78c,Namespace:calico-system,Attempt:0,}" Jan 15 01:20:39.693630 containerd[1715]: time="2026-01-15T01:20:39.693369770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7zb8r,Uid:80ef1833-e881-4202-8a43-efb4cbd7eee4,Namespace:kube-system,Attempt:0,}" Jan 15 01:20:39.832326 systemd-networkd[1599]: cali3a68483fd23: Link UP Jan 15 01:20:39.833247 systemd-networkd[1599]: cali3a68483fd23: Gained carrier Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.740 [INFO][5080] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0 goldmane-666569f655- calico-system 19b9076e-57b5-41dc-b303-63eafd79e78c 800 0 2026-01-15 01:20:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 goldmane-666569f655-qxn98 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3a68483fd23 [] [] }} ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.740 [INFO][5080] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.777 [INFO][5103] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" HandleID="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Workload="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.777 [INFO][5103] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" HandleID="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Workload="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"goldmane-666569f655-qxn98", "timestamp":"2026-01-15 01:20:39.777568669 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.777 [INFO][5103] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.777 [INFO][5103] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.777 [INFO][5103] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.787 [INFO][5103] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.793 [INFO][5103] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.799 [INFO][5103] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.802 [INFO][5103] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.806 [INFO][5103] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.806 [INFO][5103] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.808 [INFO][5103] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5 Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.811 [INFO][5103] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5103] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.7/26] block=192.168.72.0/26 handle="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5103] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.7/26] handle="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5103] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 01:20:39.854400 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5103] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.7/26] IPv6=[] ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" HandleID="k8s-pod-network.c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Workload="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.824 [INFO][5080] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"19b9076e-57b5-41dc-b303-63eafd79e78c", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"goldmane-666569f655-qxn98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3a68483fd23", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.824 [INFO][5080] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.7/32] ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.824 [INFO][5080] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a68483fd23 ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.834 [INFO][5080] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.835 [INFO][5080] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" 
Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"19b9076e-57b5-41dc-b303-63eafd79e78c", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 20, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5", Pod:"goldmane-666569f655-qxn98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3a68483fd23", MAC:"f6:1a:3a:e0:58:40", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:39.854960 containerd[1715]: 2026-01-15 01:20:39.849 [INFO][5080] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" Namespace="calico-system" Pod="goldmane-666569f655-qxn98" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-goldmane--666569f655--qxn98-eth0" Jan 15 01:20:39.858381 kubelet[3275]: E0115 01:20:39.858059 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:20:39.858850 kubelet[3275]: E0115 01:20:39.858769 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:20:39.860631 kubelet[3275]: E0115 01:20:39.860356 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:20:39.889978 containerd[1715]: time="2026-01-15T01:20:39.889732068Z" level=info msg="connecting to shim c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5" address="unix:///run/containerd/s/ac9ed8a5926b54b9c64cf938488738969249a3910b9ae6b028b2e61ac446ce6e" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:39.912000 audit[5151]: NETFILTER_CFG table=filter:129 family=2 entries=62 op=nft_register_chain pid=5151 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:39.912000 audit[5151]: SYSCALL arch=c000003e syscall=46 success=yes exit=31564 a0=3 a1=7ffd346bab20 a2=0 a3=7ffd346bab0c items=0 ppid=4600 pid=5151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.912000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:39.919501 kubelet[3275]: I0115 01:20:39.919403 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-2sh5x" podStartSLOduration=41.919386716 podStartE2EDuration="41.919386716s" podCreationTimestamp="2026-01-15 01:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:20:39.917374238 +0000 UTC m=+46.340814872" watchObservedRunningTime="2026-01-15 01:20:39.919386716 +0000 UTC m=+46.342827359" Jan 15 01:20:39.933207 systemd-networkd[1599]: cali3dde3515453: Gained IPv6LL Jan 15 01:20:39.938262 systemd[1]: Started cri-containerd-c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5.scope - libcontainer container c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5. 
Jan 15 01:20:39.973000 audit: BPF prog-id=241 op=LOAD Jan 15 01:20:39.974000 audit: BPF prog-id=242 op=LOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=242 op=UNLOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=243 op=LOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=244 op=LOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=244 op=UNLOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=243 op=UNLOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.974000 audit: BPF prog-id=245 op=LOAD Jan 15 01:20:39.974000 audit[5150]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=5138 pid=5150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332643237646534396363316535666633613736613533303663366362 Jan 15 01:20:39.986000 audit[5171]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=5171 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:39.986000 audit[5171]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff73f104c0 a2=0 a3=7fff73f104ac items=0 ppid=3378 pid=5171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.986000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:39.993000 audit[5171]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=5171 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:39.993000 audit[5171]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7fff73f104c0 a2=0 a3=0 items=0 ppid=3378 pid=5171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:39.993000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:39.996481 systemd-networkd[1599]: cali978989ccb0f: Gained IPv6LL Jan 15 01:20:40.007633 systemd-networkd[1599]: calibf168551855: Link UP Jan 15 01:20:40.008416 systemd-networkd[1599]: calibf168551855: Gained carrier Jan 15 01:20:40.030000 audit[5177]: NETFILTER_CFG table=filter:132 family=2 entries=17 op=nft_register_rule pid=5177 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:40.030000 audit[5177]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffdf949fca0 a2=0 a3=7ffdf949fc8c items=0 ppid=3378 pid=5177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.030000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 
01:20:40.032000 audit[5177]: NETFILTER_CFG table=nat:133 family=2 entries=35 op=nft_register_chain pid=5177 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:40.032000 audit[5177]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffdf949fca0 a2=0 a3=7ffdf949fc8c items=0 ppid=3378 pid=5177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.032000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.765 [INFO][5089] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0 coredns-668d6bf9bc- kube-system 80ef1833-e881-4202-8a43-efb4cbd7eee4 789 0 2026-01-15 01:19:58 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-d76f075714 coredns-668d6bf9bc-7zb8r eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibf168551855 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.765 [INFO][5089] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.805 [INFO][5111] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" HandleID="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.805 [INFO][5111] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" HandleID="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-d76f075714", "pod":"coredns-668d6bf9bc-7zb8r", "timestamp":"2026-01-15 01:20:39.805678914 +0000 UTC"}, Hostname:"ci-4515-1-0-n-d76f075714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.805 [INFO][5111] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5111] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
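The audit PROCTITLE records in this section (for runc and for iptables-restore) carry the process command line as one hex string with NUL-separated arguments. Below is a minimal Python sketch for turning such a string back into a readable argv; the example value is a truncated prefix of the runc proctitle recorded above, and nothing in the sketch is part of auditd itself.

```python
# Decode an audit PROCTITLE hex string into its NUL-separated argv.
# The example value is a truncated prefix of the runc proctitle from the
# audit records above; any PROCTITLE hex from this journal works the same way.
proctitle_hex = (
    "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
    "2F6B38732E696F002D2D6C6F67"
)

argv = bytes.fromhex(proctitle_hex).split(b"\x00")
print([arg.decode() for arg in argv])
# ['runc', '--root', '/run/containerd/runc/k8s.io', '--log']
```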
Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.822 [INFO][5111] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-d76f075714' Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.888 [INFO][5111] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.901 [INFO][5111] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.930 [INFO][5111] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.936 [INFO][5111] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.950 [INFO][5111] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.951 [INFO][5111] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.958 [INFO][5111] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.974 [INFO][5111] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.993 [INFO][5111] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.8/26] block=192.168.72.0/26 handle="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.993 [INFO][5111] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.8/26] handle="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" host="ci-4515-1-0-n-d76f075714" Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.993 [INFO][5111] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
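The IPAM entries above show the node holding an affinity for block 192.168.72.0/26 and claiming 192.168.72.8 from it for the coredns pod. The short Python sketch below, using only the standard ipaddress module, re-checks those two values copied from the log; it is an illustration of the arithmetic, not Calico code.

```python
import ipaddress

# Values copied from the Calico IPAM log entries above; this only
# re-checks the arithmetic and is not part of Calico itself.
block = ipaddress.ip_network("192.168.72.0/26")    # per-node affinity block
claimed = ipaddress.ip_address("192.168.72.8")     # IP claimed for coredns-668d6bf9bc-7zb8r

print(block.num_addresses)                         # 64 addresses in a /26 block
print(claimed in block)                            # True: the claim stays inside the affine block
print(ipaddress.ip_interface("192.168.72.8/32"))   # form recorded on the WorkloadEndpoint
```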
Jan 15 01:20:40.036053 containerd[1715]: 2026-01-15 01:20:39.993 [INFO][5111] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.8/26] IPv6=[] ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" HandleID="k8s-pod-network.79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Workload="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:39.997 [INFO][5089] cni-plugin/k8s.go 418: Populated endpoint ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"80ef1833-e881-4202-8a43-efb4cbd7eee4", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"", Pod:"coredns-668d6bf9bc-7zb8r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibf168551855", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:39.997 [INFO][5089] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.8/32] ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:40.000 [INFO][5089] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibf168551855 ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:40.007 [INFO][5089] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:40.011 [INFO][5089] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"80ef1833-e881-4202-8a43-efb4cbd7eee4", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 1, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-d76f075714", ContainerID:"79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e", Pod:"coredns-668d6bf9bc-7zb8r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibf168551855", MAC:"c6:36:43:0a:82:ca", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 01:20:40.036534 containerd[1715]: 2026-01-15 01:20:40.030 [INFO][5089] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" Namespace="kube-system" Pod="coredns-668d6bf9bc-7zb8r" WorkloadEndpoint="ci--4515--1--0--n--d76f075714-k8s-coredns--668d6bf9bc--7zb8r-eth0" Jan 15 01:20:40.049787 containerd[1715]: time="2026-01-15T01:20:40.049748670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-qxn98,Uid:19b9076e-57b5-41dc-b303-63eafd79e78c,Namespace:calico-system,Attempt:0,} returns sandbox id \"c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5\"" Jan 15 01:20:40.052239 containerd[1715]: time="2026-01-15T01:20:40.052208669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 01:20:40.059235 systemd-networkd[1599]: calidfac3594a5b: Gained IPv6LL Jan 15 01:20:40.059000 audit[5191]: NETFILTER_CFG table=filter:134 family=2 entries=44 op=nft_register_chain pid=5191 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 01:20:40.059000 audit[5191]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=21484 a0=3 a1=7ffda7f2e3b0 a2=0 a3=7ffda7f2e39c items=0 ppid=4600 pid=5191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.059000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 01:20:40.081811 containerd[1715]: time="2026-01-15T01:20:40.081721018Z" level=info msg="connecting to shim 79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e" address="unix:///run/containerd/s/e2b60962cd41497fec738a4462880e34dad0cbef0c63f8935da78dc6c15a6226" namespace=k8s.io protocol=ttrpc version=3 Jan 15 01:20:40.109225 systemd[1]: Started cri-containerd-79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e.scope - libcontainer container 79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e. Jan 15 01:20:40.120000 audit: BPF prog-id=246 op=LOAD Jan 15 01:20:40.121000 audit: BPF prog-id=247 op=LOAD Jan 15 01:20:40.121000 audit[5212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.121000 audit: BPF prog-id=247 op=UNLOAD Jan 15 01:20:40.121000 audit[5212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.121000 audit: BPF prog-id=248 op=LOAD Jan 15 01:20:40.121000 audit[5212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.121000 audit: BPF prog-id=249 op=LOAD Jan 15 01:20:40.121000 audit[5212]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.121000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.122000 audit: BPF prog-id=249 op=UNLOAD Jan 15 01:20:40.122000 audit[5212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.122000 audit: BPF prog-id=248 op=UNLOAD Jan 15 01:20:40.122000 audit[5212]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.122000 audit: BPF prog-id=250 op=LOAD Jan 15 01:20:40.122000 audit[5212]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=5200 pid=5212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739636636343731303535393032306263376438626531626132333834 Jan 15 01:20:40.157799 containerd[1715]: time="2026-01-15T01:20:40.157769581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7zb8r,Uid:80ef1833-e881-4202-8a43-efb4cbd7eee4,Namespace:kube-system,Attempt:0,} returns sandbox id \"79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e\"" Jan 15 01:20:40.161184 containerd[1715]: time="2026-01-15T01:20:40.161152166Z" level=info msg="CreateContainer within sandbox \"79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 01:20:40.171998 containerd[1715]: time="2026-01-15T01:20:40.171928641Z" level=info msg="Container eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:20:40.178039 containerd[1715]: time="2026-01-15T01:20:40.177985938Z" level=info msg="CreateContainer within sandbox \"79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41\"" Jan 15 01:20:40.179577 containerd[1715]: time="2026-01-15T01:20:40.178823382Z" level=info msg="StartContainer for 
\"eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41\"" Jan 15 01:20:40.179763 containerd[1715]: time="2026-01-15T01:20:40.179746801Z" level=info msg="connecting to shim eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41" address="unix:///run/containerd/s/e2b60962cd41497fec738a4462880e34dad0cbef0c63f8935da78dc6c15a6226" protocol=ttrpc version=3 Jan 15 01:20:40.189328 systemd-networkd[1599]: calia1efb9fd50f: Gained IPv6LL Jan 15 01:20:40.205256 systemd[1]: Started cri-containerd-eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41.scope - libcontainer container eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41. Jan 15 01:20:40.221000 audit: BPF prog-id=251 op=LOAD Jan 15 01:20:40.221000 audit: BPF prog-id=252 op=LOAD Jan 15 01:20:40.221000 audit[5239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.221000 audit: BPF prog-id=252 op=UNLOAD Jan 15 01:20:40.221000 audit[5239]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.221000 audit: BPF prog-id=253 op=LOAD Jan 15 01:20:40.221000 audit[5239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.221000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.222000 audit: BPF prog-id=254 op=LOAD Jan 15 01:20:40.222000 audit[5239]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.222000 audit: BPF prog-id=254 op=UNLOAD Jan 15 01:20:40.222000 audit[5239]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.222000 audit: BPF prog-id=253 op=UNLOAD Jan 15 01:20:40.222000 audit[5239]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.222000 audit: BPF prog-id=255 op=LOAD Jan 15 01:20:40.222000 audit[5239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5200 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565613836383737663932346234613664373839366633356366646534 Jan 15 01:20:40.241275 containerd[1715]: time="2026-01-15T01:20:40.241202643Z" level=info msg="StartContainer for \"eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41\" returns successfully" Jan 15 01:20:40.394221 containerd[1715]: time="2026-01-15T01:20:40.393989451Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:40.395547 containerd[1715]: time="2026-01-15T01:20:40.395458143Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 01:20:40.395547 containerd[1715]: time="2026-01-15T01:20:40.395511960Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:40.395781 kubelet[3275]: E0115 01:20:40.395734 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:20:40.396287 kubelet[3275]: E0115 01:20:40.395796 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 
01:20:40.396287 kubelet[3275]: E0115 01:20:40.395961 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:40.397595 kubelet[3275]: E0115 01:20:40.397536 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:20:40.864049 
kubelet[3275]: E0115 01:20:40.863343 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:20:40.918000 audit[5272]: NETFILTER_CFG table=filter:135 family=2 entries=14 op=nft_register_rule pid=5272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:40.918000 audit[5272]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffcaa37250 a2=0 a3=7fffcaa3723c items=0 ppid=3378 pid=5272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.918000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:40.943000 audit[5272]: NETFILTER_CFG table=nat:136 family=2 entries=56 op=nft_register_chain pid=5272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 01:20:40.943000 audit[5272]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7fffcaa37250 a2=0 a3=7fffcaa3723c items=0 ppid=3378 pid=5272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:20:40.943000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 01:20:41.531213 systemd-networkd[1599]: cali3a68483fd23: Gained IPv6LL Jan 15 01:20:41.724191 systemd-networkd[1599]: calibf168551855: Gained IPv6LL Jan 15 01:20:41.868781 kubelet[3275]: E0115 01:20:41.868675 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:20:41.884136 kubelet[3275]: I0115 01:20:41.883992 3275 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-7zb8r" podStartSLOduration=43.883974884 podStartE2EDuration="43.883974884s" podCreationTimestamp="2026-01-15 01:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 01:20:40.892672303 +0000 UTC m=+47.316112939" watchObservedRunningTime="2026-01-15 01:20:41.883974884 +0000 UTC m=+48.307415525" Jan 15 01:20:50.692149 containerd[1715]: time="2026-01-15T01:20:50.691983240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:20:51.021689 containerd[1715]: time="2026-01-15T01:20:51.021564472Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:51.024775 containerd[1715]: 
time="2026-01-15T01:20:51.024678398Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:20:51.024775 containerd[1715]: time="2026-01-15T01:20:51.024799872Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:51.025676 kubelet[3275]: E0115 01:20:51.025253 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:51.025676 kubelet[3275]: E0115 01:20:51.025324 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:51.026391 kubelet[3275]: E0115 01:20:51.025733 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcc99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:51.026548 containerd[1715]: time="2026-01-15T01:20:51.025981378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 01:20:51.027236 kubelet[3275]: E0115 01:20:51.026896 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:20:51.371864 containerd[1715]: time="2026-01-15T01:20:51.371651952Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:51.373026 containerd[1715]: time="2026-01-15T01:20:51.372945798Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 01:20:51.373154 containerd[1715]: time="2026-01-15T01:20:51.373126159Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:51.373379 kubelet[3275]: E0115 01:20:51.373321 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:20:51.373379 kubelet[3275]: E0115 01:20:51.373365 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:20:51.373981 kubelet[3275]: E0115 01:20:51.373579 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8086b71984894a589b5b3e200fb6432c,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:51.381084 containerd[1715]: time="2026-01-15T01:20:51.380911489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 01:20:51.730524 containerd[1715]: time="2026-01-15T01:20:51.730485515Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:51.732159 containerd[1715]: time="2026-01-15T01:20:51.732064342Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 01:20:51.732313 containerd[1715]: time="2026-01-15T01:20:51.732249718Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:51.732481 kubelet[3275]: E0115 01:20:51.732440 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:20:51.732533 kubelet[3275]: E0115 01:20:51.732493 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:20:51.732812 containerd[1715]: time="2026-01-15T01:20:51.732755587Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 
01:20:51.732968 kubelet[3275]: E0115 01:20:51.732908 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:51.734776 kubelet[3275]: E0115 01:20:51.734439 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:20:52.061863 containerd[1715]: time="2026-01-15T01:20:52.061704829Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:52.063355 containerd[1715]: time="2026-01-15T01:20:52.063305109Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 01:20:52.063457 containerd[1715]: 
time="2026-01-15T01:20:52.063401701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:52.063701 kubelet[3275]: E0115 01:20:52.063642 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:20:52.064100 kubelet[3275]: E0115 01:20:52.063705 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:20:52.064100 kubelet[3275]: E0115 01:20:52.063850 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:52.066616 containerd[1715]: time="2026-01-15T01:20:52.066379787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 01:20:52.395353 containerd[1715]: time="2026-01-15T01:20:52.395212114Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:52.397178 containerd[1715]: time="2026-01-15T01:20:52.397030948Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 01:20:52.397178 containerd[1715]: time="2026-01-15T01:20:52.397075123Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:52.397750 kubelet[3275]: E0115 01:20:52.397371 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:20:52.397750 kubelet[3275]: E0115 01:20:52.397445 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:20:52.398560 kubelet[3275]: E0115 01:20:52.398499 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 
01:20:52.399907 kubelet[3275]: E0115 01:20:52.399856 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:20:53.693980 containerd[1715]: time="2026-01-15T01:20:53.693419347Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:20:54.018313 containerd[1715]: time="2026-01-15T01:20:54.018100006Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:54.020662 containerd[1715]: time="2026-01-15T01:20:54.020565163Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:20:54.020732 containerd[1715]: time="2026-01-15T01:20:54.020653485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:54.020818 kubelet[3275]: E0115 01:20:54.020782 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:54.021196 kubelet[3275]: E0115 01:20:54.020828 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:20:54.021196 kubelet[3275]: E0115 01:20:54.020965 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:54.022436 kubelet[3275]: E0115 01:20:54.022403 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:20:54.692809 containerd[1715]: time="2026-01-15T01:20:54.692717284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 01:20:55.002290 containerd[1715]: time="2026-01-15T01:20:55.002156136Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:55.007281 containerd[1715]: time="2026-01-15T01:20:55.007233936Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 01:20:55.007376 containerd[1715]: time="2026-01-15T01:20:55.007315560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" 
Jan 15 01:20:55.007586 kubelet[3275]: E0115 01:20:55.007516 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:20:55.007586 kubelet[3275]: E0115 01:20:55.007564 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:20:55.007977 containerd[1715]: time="2026-01-15T01:20:55.007932539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 01:20:55.008360 kubelet[3275]: E0115 01:20:55.007847 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:55.009412 kubelet[3275]: E0115 01:20:55.009384 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:20:55.346520 containerd[1715]: time="2026-01-15T01:20:55.346397436Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:20:55.347860 containerd[1715]: time="2026-01-15T01:20:55.347795931Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 01:20:55.348044 containerd[1715]: time="2026-01-15T01:20:55.347881416Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 01:20:55.348133 kubelet[3275]: E0115 01:20:55.348051 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:20:55.348133 kubelet[3275]: E0115 01:20:55.348102 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:20:55.348574 kubelet[3275]: E0115 01:20:55.348218 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 01:20:55.349381 kubelet[3275]: E0115 01:20:55.349343 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:04.693445 kubelet[3275]: E0115 01:21:04.693400 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:21:05.694778 kubelet[3275]: E0115 01:21:05.694342 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:21:05.695293 kubelet[3275]: E0115 01:21:05.695266 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:21:07.698778 kubelet[3275]: E0115 01:21:07.698714 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:21:08.692806 kubelet[3275]: E0115 01:21:08.692773 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:09.694107 kubelet[3275]: E0115 01:21:09.693498 3275 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:21:18.692701 containerd[1715]: time="2026-01-15T01:21:18.692657946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:21:19.037568 containerd[1715]: time="2026-01-15T01:21:19.037308174Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:19.039424 containerd[1715]: time="2026-01-15T01:21:19.039375091Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:21:19.039731 containerd[1715]: time="2026-01-15T01:21:19.039530270Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:19.039958 kubelet[3275]: E0115 01:21:19.039898 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:21:19.040773 kubelet[3275]: E0115 01:21:19.040053 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:21:19.040874 kubelet[3275]: E0115 01:21:19.040837 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcc99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:19.041707 containerd[1715]: time="2026-01-15T01:21:19.041524522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 01:21:19.042551 kubelet[3275]: E0115 01:21:19.042502 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:21:19.376080 containerd[1715]: time="2026-01-15T01:21:19.375911991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:19.377509 containerd[1715]: time="2026-01-15T01:21:19.377454740Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 01:21:19.377734 containerd[1715]: time="2026-01-15T01:21:19.377585760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:19.377804 kubelet[3275]: E0115 01:21:19.377745 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:21:19.377874 kubelet[3275]: E0115 01:21:19.377811 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:21:19.378035 kubelet[3275]: E0115 01:21:19.377943 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8086b71984894a589b5b3e200fb6432c,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:19.380784 containerd[1715]: time="2026-01-15T01:21:19.380747736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 01:21:19.702427 containerd[1715]: time="2026-01-15T01:21:19.702270472Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:19.707374 containerd[1715]: time="2026-01-15T01:21:19.707270093Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 01:21:19.707685 containerd[1715]: time="2026-01-15T01:21:19.707464601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:19.707756 kubelet[3275]: E0115 01:21:19.707698 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:21:19.707756 kubelet[3275]: E0115 01:21:19.707739 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:21:19.708040 containerd[1715]: time="2026-01-15T01:21:19.707990736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 
01:21:19.708197 kubelet[3275]: E0115 01:21:19.708148 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:19.709559 kubelet[3275]: E0115 01:21:19.709419 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:21:20.022367 containerd[1715]: time="2026-01-15T01:21:20.022152940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:20.024408 containerd[1715]: time="2026-01-15T01:21:20.023336951Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 01:21:20.024408 containerd[1715]: 
time="2026-01-15T01:21:20.023382564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:20.024722 kubelet[3275]: E0115 01:21:20.024676 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:21:20.024769 kubelet[3275]: E0115 01:21:20.024733 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:21:20.025082 kubelet[3275]: E0115 01:21:20.025034 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:20.025189 containerd[1715]: time="2026-01-15T01:21:20.025116365Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 01:21:20.366457 containerd[1715]: time="2026-01-15T01:21:20.366236886Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:20.367679 containerd[1715]: time="2026-01-15T01:21:20.367547948Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 01:21:20.367679 containerd[1715]: time="2026-01-15T01:21:20.367646448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:20.367846 kubelet[3275]: E0115 01:21:20.367805 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:21:20.368177 kubelet[3275]: E0115 01:21:20.367864 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:21:20.369112 kubelet[3275]: E0115 01:21:20.368697 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:20.369304 containerd[1715]: time="2026-01-15T01:21:20.368822742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 01:21:20.369833 kubelet[3275]: E0115 01:21:20.369800 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:20.700273 containerd[1715]: time="2026-01-15T01:21:20.700236119Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:20.701692 containerd[1715]: time="2026-01-15T01:21:20.701647043Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 01:21:20.701786 containerd[1715]: time="2026-01-15T01:21:20.701732930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:20.703282 kubelet[3275]: E0115 01:21:20.702089 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:21:20.703282 kubelet[3275]: E0115 01:21:20.702128 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:21:20.703282 kubelet[3275]: E0115 01:21:20.702300 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:20.703477 containerd[1715]: time="2026-01-15T01:21:20.703097555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:21:20.704219 kubelet[3275]: E0115 01:21:20.704183 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:21:21.033585 containerd[1715]: time="2026-01-15T01:21:21.033301998Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:21.034771 containerd[1715]: time="2026-01-15T01:21:21.034728763Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:21:21.034862 containerd[1715]: time="2026-01-15T01:21:21.034804978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:21.035115 kubelet[3275]: E0115 01:21:21.035070 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:21:21.035254 kubelet[3275]: E0115 01:21:21.035199 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:21:21.035419 kubelet[3275]: E0115 01:21:21.035384 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:21.036566 kubelet[3275]: E0115 01:21:21.036530 3275 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:21:23.695025 containerd[1715]: time="2026-01-15T01:21:23.694941035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 01:21:24.024107 containerd[1715]: time="2026-01-15T01:21:24.023799250Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:21:24.026026 containerd[1715]: time="2026-01-15T01:21:24.025128933Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 01:21:24.026911 kubelet[3275]: E0115 01:21:24.026326 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:21:24.026911 kubelet[3275]: E0115 01:21:24.026382 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:21:24.026911 kubelet[3275]: E0115 01:21:24.026533 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 01:21:24.027417 containerd[1715]: time="2026-01-15T01:21:24.025239039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 01:21:24.027940 kubelet[3275]: E0115 01:21:24.027638 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:21:30.693322 kubelet[3275]: E0115 01:21:30.692703 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:21:31.692499 kubelet[3275]: E0115 01:21:31.692450 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:31.694594 kubelet[3275]: E0115 01:21:31.694563 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:21:32.691881 kubelet[3275]: E0115 01:21:32.691842 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:21:34.693331 kubelet[3275]: E0115 01:21:34.693288 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:21:36.692602 kubelet[3275]: E0115 01:21:36.692450 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:21:43.698085 kubelet[3275]: E0115 01:21:43.697528 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:21:44.693680 kubelet[3275]: E0115 01:21:44.693639 3275 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:21:45.692657 kubelet[3275]: E0115 01:21:45.692467 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:45.694104 kubelet[3275]: E0115 01:21:45.693085 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:21:45.694377 kubelet[3275]: E0115 01:21:45.694316 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:21:49.696478 kubelet[3275]: E0115 01:21:49.696440 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" 
podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:21:56.693799 kubelet[3275]: E0115 01:21:56.693410 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:21:58.693290 kubelet[3275]: E0115 01:21:58.692790 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:21:58.693683 kubelet[3275]: E0115 01:21:58.693290 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:21:58.694237 kubelet[3275]: E0115 01:21:58.693950 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:21:58.694375 kubelet[3275]: E0115 01:21:58.694267 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:22:01.693718 kubelet[3275]: E0115 01:22:01.692540 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:22:07.693666 containerd[1715]: time="2026-01-15T01:22:07.693600758Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:22:08.253857 containerd[1715]: time="2026-01-15T01:22:08.253817370Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:08.254950 containerd[1715]: time="2026-01-15T01:22:08.254886007Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:22:08.254950 containerd[1715]: time="2026-01-15T01:22:08.254918407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:08.255201 kubelet[3275]: E0115 01:22:08.255169 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:22:08.255808 kubelet[3275]: E0115 01:22:08.255214 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:22:08.255808 kubelet[3275]: E0115 01:22:08.255332 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:08.257119 kubelet[3275]: E0115 01:22:08.257070 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:22:09.696120 containerd[1715]: time="2026-01-15T01:22:09.696064401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 01:22:10.178897 containerd[1715]: time="2026-01-15T01:22:10.178598776Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:10.181400 containerd[1715]: time="2026-01-15T01:22:10.181363724Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 01:22:10.181477 containerd[1715]: time="2026-01-15T01:22:10.181447789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:10.181627 kubelet[3275]: E0115 01:22:10.181590 3275 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:22:10.181867 kubelet[3275]: E0115 01:22:10.181632 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:22:10.181867 kubelet[3275]: E0115 01:22:10.181812 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:10.182684 containerd[1715]: time="2026-01-15T01:22:10.182664995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:22:10.683227 containerd[1715]: time="2026-01-15T01:22:10.683161423Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:10.684493 containerd[1715]: time="2026-01-15T01:22:10.684423221Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:22:10.684493 
containerd[1715]: time="2026-01-15T01:22:10.684459880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:10.684763 kubelet[3275]: E0115 01:22:10.684696 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:22:10.684814 kubelet[3275]: E0115 01:22:10.684776 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:22:10.685104 kubelet[3275]: E0115 01:22:10.685066 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcc99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:10.685846 containerd[1715]: time="2026-01-15T01:22:10.685823856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 01:22:10.686528 
kubelet[3275]: E0115 01:22:10.686489 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:22:11.198558 containerd[1715]: time="2026-01-15T01:22:11.197959271Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:11.200126 containerd[1715]: time="2026-01-15T01:22:11.200025460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 01:22:11.200126 containerd[1715]: time="2026-01-15T01:22:11.200057179Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:11.201332 kubelet[3275]: E0115 01:22:11.201132 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:22:11.201332 kubelet[3275]: E0115 01:22:11.201190 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:22:11.201332 kubelet[3275]: E0115 01:22:11.201301 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:11.202815 kubelet[3275]: E0115 01:22:11.202778 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:22:11.693333 containerd[1715]: time="2026-01-15T01:22:11.692751591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 01:22:12.205857 containerd[1715]: time="2026-01-15T01:22:12.205784664Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:12.207349 containerd[1715]: time="2026-01-15T01:22:12.207274050Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 01:22:12.207410 containerd[1715]: time="2026-01-15T01:22:12.207342710Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:12.207580 kubelet[3275]: E0115 01:22:12.207555 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:22:12.209331 kubelet[3275]: E0115 01:22:12.207823 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:22:12.209331 kubelet[3275]: E0115 01:22:12.207926 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8086b71984894a589b5b3e200fb6432c,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:12.210031 containerd[1715]: time="2026-01-15T01:22:12.209951605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 01:22:12.727520 containerd[1715]: time="2026-01-15T01:22:12.727465591Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:12.731412 containerd[1715]: time="2026-01-15T01:22:12.731360261Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 01:22:12.731492 containerd[1715]: time="2026-01-15T01:22:12.731448886Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:12.732127 kubelet[3275]: E0115 01:22:12.731607 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:22:12.732127 kubelet[3275]: E0115 01:22:12.731652 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:22:12.732127 kubelet[3275]: E0115 01:22:12.731764 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:12.732861 kubelet[3275]: E0115 01:22:12.732832 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:22:13.694736 containerd[1715]: time="2026-01-15T01:22:13.694430906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 01:22:14.206806 containerd[1715]: time="2026-01-15T01:22:14.206638949Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:14.207829 containerd[1715]: time="2026-01-15T01:22:14.207698577Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 01:22:14.208356 containerd[1715]: time="2026-01-15T01:22:14.207749234Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:14.208541 kubelet[3275]: E0115 01:22:14.208504 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:22:14.208799 kubelet[3275]: E0115 01:22:14.208553 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:22:14.208799 kubelet[3275]: E0115 01:22:14.208742 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:14.209668 containerd[1715]: time="2026-01-15T01:22:14.209631880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 01:22:14.210173 kubelet[3275]: E0115 01:22:14.210130 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:22:14.718529 containerd[1715]: time="2026-01-15T01:22:14.718376785Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:22:14.719960 containerd[1715]: time="2026-01-15T01:22:14.719862268Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 01:22:14.719960 containerd[1715]: time="2026-01-15T01:22:14.719934887Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 01:22:14.721113 kubelet[3275]: E0115 01:22:14.720117 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:22:14.721113 kubelet[3275]: E0115 01:22:14.720161 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:22:14.721113 
kubelet[3275]: E0115 01:22:14.720470 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 01:22:14.722222 kubelet[3275]: E0115 01:22:14.722197 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:22:21.695615 kubelet[3275]: E0115 01:22:21.695574 3275 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:22:23.693347 kubelet[3275]: E0115 01:22:23.693282 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:22:23.694338 kubelet[3275]: E0115 01:22:23.693456 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:22:24.691373 kubelet[3275]: E0115 01:22:24.691337 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:22:29.693064 kubelet[3275]: E0115 01:22:29.692004 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:22:29.693064 kubelet[3275]: 
E0115 01:22:29.693064 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:22:34.693310 kubelet[3275]: E0115 01:22:34.693248 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:22:35.702615 kubelet[3275]: E0115 01:22:35.702566 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:22:36.692051 kubelet[3275]: E0115 01:22:36.691130 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:22:39.694281 kubelet[3275]: E0115 01:22:39.694179 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:22:41.694763 kubelet[3275]: E0115 01:22:41.694729 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:22:43.708364 kubelet[3275]: E0115 01:22:43.708308 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:22:47.694572 kubelet[3275]: E0115 01:22:47.694530 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:22:47.696500 kubelet[3275]: E0115 01:22:47.696463 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:22:47.698767 kubelet[3275]: E0115 01:22:47.698686 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:22:52.692546 kubelet[3275]: E0115 01:22:52.692494 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:22:53.692680 kubelet[3275]: E0115 01:22:53.692558 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:22:58.692451 kubelet[3275]: E0115 01:22:58.692392 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:22:58.694781 kubelet[3275]: E0115 01:22:58.694746 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:23:00.693625 kubelet[3275]: E0115 01:23:00.693487 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:23:01.694089 kubelet[3275]: E0115 01:23:01.694056 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:23:06.691512 kubelet[3275]: E0115 01:23:06.691460 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:23:07.695484 kubelet[3275]: E0115 01:23:07.695171 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:23:11.694218 kubelet[3275]: E0115 01:23:11.694170 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:23:12.692392 kubelet[3275]: E0115 01:23:12.692357 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:23:13.693928 kubelet[3275]: E0115 01:23:13.693505 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:23:15.694184 kubelet[3275]: E0115 01:23:15.694149 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:23:18.692567 kubelet[3275]: E0115 01:23:18.692455 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:23:18.695091 kubelet[3275]: E0115 01:23:18.693065 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:23:25.693002 kubelet[3275]: E0115 01:23:25.692929 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:23:26.692356 kubelet[3275]: E0115 01:23:26.692287 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:23:26.692731 kubelet[3275]: E0115 01:23:26.692706 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:23:29.692683 containerd[1715]: time="2026-01-15T01:23:29.692618669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:23:30.034766 containerd[1715]: time="2026-01-15T01:23:30.034519151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:30.035738 containerd[1715]: time="2026-01-15T01:23:30.035704399Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:23:30.035794 containerd[1715]: time="2026-01-15T01:23:30.035784667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:30.036020 kubelet[3275]: E0115 01:23:30.035933 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:23:30.036020 kubelet[3275]: E0115 01:23:30.035980 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:23:30.036766 kubelet[3275]: E0115 01:23:30.036721 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-grnmf_calico-apiserver(fc1c990f-1003-460d-a72d-34a2a5fb4d83): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:30.037905 kubelet[3275]: E0115 01:23:30.037878 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:23:31.693514 containerd[1715]: time="2026-01-15T01:23:31.693480849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 01:23:32.030450 containerd[1715]: time="2026-01-15T01:23:32.030226777Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:32.031529 containerd[1715]: time="2026-01-15T01:23:32.031470163Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 01:23:32.031600 containerd[1715]: 
time="2026-01-15T01:23:32.031505908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:32.031788 kubelet[3275]: E0115 01:23:32.031755 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:23:32.032080 kubelet[3275]: E0115 01:23:32.031800 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 01:23:32.032080 kubelet[3275]: E0115 01:23:32.031910 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcc99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5b8cdf5dcc-9rs84_calico-apiserver(6ebcb69d-4b71-4631-9554-a5f179cc05ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:32.033345 kubelet[3275]: E0115 01:23:32.033319 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:23:32.692807 kubelet[3275]: E0115 01:23:32.692127 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:23:33.776758 systemd[1]: Started sshd@7-10.0.7.78:22-185.215.180.111:38362.service - OpenSSH per-connection server daemon (185.215.180.111:38362). Jan 15 01:23:33.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.78:22-185.215.180.111:38362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:23:33.778350 kernel: kauditd_printk_skb: 102 callbacks suppressed Jan 15 01:23:33.778496 kernel: audit: type=1130 audit(1768440213.776:741): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.78:22-185.215.180.111:38362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:23:33.899293 sshd[5505]: Connection closed by authenticating user root 185.215.180.111 port 38362 [preauth] Jan 15 01:23:33.898000 audit[5505]: USER_ERR pid=5505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=185.215.180.111 addr=185.215.180.111 terminal=ssh res=failed' Jan 15 01:23:33.901402 systemd[1]: sshd@7-10.0.7.78:22-185.215.180.111:38362.service: Deactivated successfully. Jan 15 01:23:33.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.78:22-185.215.180.111:38362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:23:33.904501 kernel: audit: type=1109 audit(1768440213.898:742): pid=5505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=185.215.180.111 addr=185.215.180.111 terminal=ssh res=failed' Jan 15 01:23:33.904545 kernel: audit: type=1131 audit(1768440213.900:743): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.78:22-185.215.180.111:38362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:23:37.693420 containerd[1715]: time="2026-01-15T01:23:37.693384890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 01:23:38.014495 containerd[1715]: time="2026-01-15T01:23:38.014359900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:38.016052 containerd[1715]: time="2026-01-15T01:23:38.015991193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 01:23:38.016148 containerd[1715]: time="2026-01-15T01:23:38.016027517Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:38.016318 kubelet[3275]: E0115 01:23:38.016283 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:23:38.016566 kubelet[3275]: E0115 01:23:38.016331 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 01:23:38.016566 kubelet[3275]: E0115 01:23:38.016423 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8086b71984894a589b5b3e200fb6432c,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:38.018717 containerd[1715]: time="2026-01-15T01:23:38.018685887Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 01:23:38.352741 containerd[1715]: time="2026-01-15T01:23:38.352178399Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:38.353391 containerd[1715]: time="2026-01-15T01:23:38.353362968Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 01:23:38.355022 containerd[1715]: time="2026-01-15T01:23:38.353443686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:38.355085 kubelet[3275]: E0115 01:23:38.353564 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:23:38.355085 kubelet[3275]: E0115 01:23:38.353609 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 01:23:38.355085 kubelet[3275]: E0115 01:23:38.353723 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-7f47b45b95-r7r79_calico-system(20583031-73a3-4ec4-aced-e96f0a2ba67b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:38.355085 kubelet[3275]: E0115 01:23:38.355039 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:23:38.692657 containerd[1715]: time="2026-01-15T01:23:38.692624280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 01:23:39.025514 containerd[1715]: time="2026-01-15T01:23:39.025200663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:39.026649 containerd[1715]: time="2026-01-15T01:23:39.026544272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 01:23:39.026649 containerd[1715]: time="2026-01-15T01:23:39.026607485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:39.026784 kubelet[3275]: E0115 01:23:39.026740 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:23:39.026990 kubelet[3275]: E0115 01:23:39.026784 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 01:23:39.027063 kubelet[3275]: E0115 01:23:39.026926 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:39.028880 containerd[1715]: time="2026-01-15T01:23:39.028857024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 01:23:39.360768 containerd[1715]: time="2026-01-15T01:23:39.360389410Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:39.363124 containerd[1715]: time="2026-01-15T01:23:39.361589151Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 01:23:39.363124 containerd[1715]: time="2026-01-15T01:23:39.361668229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:39.363266 kubelet[3275]: E0115 01:23:39.361816 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:23:39.363266 kubelet[3275]: E0115 01:23:39.361859 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 01:23:39.363266 kubelet[3275]: E0115 01:23:39.361969 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-srjh9_calico-system(ddb26c79-6272-4ee5-ba41-ad8ec552e6c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:39.363266 kubelet[3275]: E0115 01:23:39.363068 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:23:41.691863 containerd[1715]: time="2026-01-15T01:23:41.691805382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 01:23:42.024054 containerd[1715]: time="2026-01-15T01:23:42.023605817Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io 
Jan 15 01:23:42.025025 containerd[1715]: time="2026-01-15T01:23:42.024933659Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 01:23:42.025025 containerd[1715]: time="2026-01-15T01:23:42.024961448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:42.025187 kubelet[3275]: E0115 01:23:42.025159 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:23:42.025651 kubelet[3275]: E0115 01:23:42.025457 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 01:23:42.025651 kubelet[3275]: E0115 01:23:42.025603 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-9c45d7c9c-l7rhq_calico-system(0d14115d-26fb-4eac-a6b9-b5aa96406bb8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:42.026817 kubelet[3275]: E0115 01:23:42.026770 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:23:42.693231 kubelet[3275]: E0115 01:23:42.693187 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:23:45.691849 kubelet[3275]: E0115 01:23:45.691797 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:23:47.695362 containerd[1715]: time="2026-01-15T01:23:47.695322594Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 01:23:48.026560 containerd[1715]: time="2026-01-15T01:23:48.026103940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 01:23:48.028146 containerd[1715]: time="2026-01-15T01:23:48.027995239Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 01:23:48.028146 containerd[1715]: 
time="2026-01-15T01:23:48.028107591Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 01:23:48.028326 kubelet[3275]: E0115 01:23:48.028275 3275 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:23:48.028326 kubelet[3275]: E0115 01:23:48.028322 3275 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 01:23:48.028608 kubelet[3275]: E0115 01:23:48.028445 3275 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-qxn98_calico-system(19b9076e-57b5-41dc-b303-63eafd79e78c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 01:23:48.029832 kubelet[3275]: E0115 01:23:48.029795 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:23:48.692769 kubelet[3275]: E0115 01:23:48.692547 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:23:53.696577 kubelet[3275]: E0115 01:23:53.696457 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:23:53.699037 kubelet[3275]: E0115 01:23:53.698137 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:23:53.699814 kubelet[3275]: E0115 01:23:53.699715 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:23:58.691925 kubelet[3275]: E0115 01:23:58.691651 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:24:00.693167 kubelet[3275]: E0115 01:24:00.693102 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:01.693556 kubelet[3275]: E0115 01:24:01.693522 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:24:06.692689 kubelet[3275]: E0115 01:24:06.692649 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:24:07.695163 kubelet[3275]: E0115 01:24:07.695123 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:24:08.692161 kubelet[3275]: E0115 01:24:08.692066 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:24:10.692199 kubelet[3275]: E0115 01:24:10.692146 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:24:11.694997 kubelet[3275]: E0115 01:24:11.694958 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:14.691980 kubelet[3275]: E0115 01:24:14.691934 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:24:18.691875 kubelet[3275]: E0115 01:24:18.691786 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:24:20.692215 kubelet[3275]: E0115 01:24:20.692077 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:24:21.441974 systemd[2228]: Created slice background.slice - User Background Tasks Slice. Jan 15 01:24:21.443431 systemd[2228]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 15 01:24:21.491897 systemd[2228]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 15 01:24:22.692432 kubelet[3275]: E0115 01:24:22.692314 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:22.693644 kubelet[3275]: E0115 01:24:22.692321 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:24:23.694435 kubelet[3275]: E0115 01:24:23.694161 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:24:26.692791 kubelet[3275]: E0115 01:24:26.692752 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:24:30.692992 kubelet[3275]: E0115 01:24:30.692950 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" 
pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:24:34.691518 kubelet[3275]: E0115 01:24:34.691483 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:24:35.692536 kubelet[3275]: E0115 01:24:35.692442 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:24:36.692549 kubelet[3275]: E0115 01:24:36.692508 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:37.691599 kubelet[3275]: E0115 01:24:37.691553 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:24:38.691444 kubelet[3275]: E0115 01:24:38.691392 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:24:41.693045 kubelet[3275]: E0115 01:24:41.692495 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:24:45.692565 kubelet[3275]: E0115 01:24:45.692529 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:24:47.696080 kubelet[3275]: E0115 01:24:47.693055 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:24:47.696080 kubelet[3275]: E0115 01:24:47.694115 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:48.875288 containerd[1715]: time="2026-01-15T01:24:48.875185881Z" level=info msg="container event discarded" container=d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754 type=CONTAINER_CREATED_EVENT Jan 15 01:24:48.886508 containerd[1715]: time="2026-01-15T01:24:48.886445066Z" level=info msg="container event discarded" container=d70fb3c91a76704820c702dd29ae91967387196354715a9ffca1bc524c155754 type=CONTAINER_STARTED_EVENT Jan 15 01:24:48.909757 containerd[1715]: time="2026-01-15T01:24:48.909667484Z" level=info msg="container event discarded" 
container=3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b type=CONTAINER_CREATED_EVENT Jan 15 01:24:48.943944 containerd[1715]: time="2026-01-15T01:24:48.943878778Z" level=info msg="container event discarded" container=791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502 type=CONTAINER_CREATED_EVENT Jan 15 01:24:48.943944 containerd[1715]: time="2026-01-15T01:24:48.943924384Z" level=info msg="container event discarded" container=791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502 type=CONTAINER_STARTED_EVENT Jan 15 01:24:48.943944 containerd[1715]: time="2026-01-15T01:24:48.943932443Z" level=info msg="container event discarded" container=279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d type=CONTAINER_CREATED_EVENT Jan 15 01:24:48.943944 containerd[1715]: time="2026-01-15T01:24:48.943939251Z" level=info msg="container event discarded" container=279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d type=CONTAINER_STARTED_EVENT Jan 15 01:24:48.979146 containerd[1715]: time="2026-01-15T01:24:48.979059900Z" level=info msg="container event discarded" container=9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f type=CONTAINER_CREATED_EVENT Jan 15 01:24:48.991322 containerd[1715]: time="2026-01-15T01:24:48.991265117Z" level=info msg="container event discarded" container=e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116 type=CONTAINER_CREATED_EVENT Jan 15 01:24:49.022575 containerd[1715]: time="2026-01-15T01:24:49.022464197Z" level=info msg="container event discarded" container=3dc3cd569446c1b42cdca87f99320569116bd28fe2717e239d4e7ffbe1e35e8b type=CONTAINER_STARTED_EVENT Jan 15 01:24:49.099843 containerd[1715]: time="2026-01-15T01:24:49.099734378Z" level=info msg="container event discarded" container=9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f type=CONTAINER_STARTED_EVENT Jan 15 01:24:49.137247 containerd[1715]: time="2026-01-15T01:24:49.137105044Z" level=info msg="container event discarded" container=e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116 type=CONTAINER_STARTED_EVENT Jan 15 01:24:51.693313 kubelet[3275]: E0115 01:24:51.692823 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:24:52.694155 kubelet[3275]: E0115 01:24:52.694056 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:24:53.695064 kubelet[3275]: E0115 01:24:53.695027 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:24:57.695568 kubelet[3275]: E0115 01:24:57.695529 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:24:58.691840 kubelet[3275]: E0115 01:24:58.691787 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:24:59.041553 containerd[1715]: time="2026-01-15T01:24:59.041366519Z" level=info msg="container event discarded" container=3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915 type=CONTAINER_CREATED_EVENT Jan 15 01:24:59.041553 containerd[1715]: time="2026-01-15T01:24:59.041427319Z" level=info msg="container event discarded" container=3520410b08a151ec56461babf7a8380cdb83751639516e43e6600cd006865915 type=CONTAINER_STARTED_EVENT Jan 15 01:24:59.069745 containerd[1715]: time="2026-01-15T01:24:59.069670337Z" level=info msg="container event discarded" container=47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349 type=CONTAINER_CREATED_EVENT Jan 15 01:24:59.162078 containerd[1715]: time="2026-01-15T01:24:59.161956882Z" level=info msg="container event discarded" container=47d67f95b27456339b0754273a70cc8465782f3fd9bfc12096bea4a881b8d349 type=CONTAINER_STARTED_EVENT Jan 15 01:24:59.312642 containerd[1715]: time="2026-01-15T01:24:59.312364706Z" level=info msg="container event discarded" container=56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d type=CONTAINER_CREATED_EVENT Jan 15 01:24:59.312642 containerd[1715]: time="2026-01-15T01:24:59.312426323Z" level=info msg="container event discarded" 
container=56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d type=CONTAINER_STARTED_EVENT Jan 15 01:25:02.403685 containerd[1715]: time="2026-01-15T01:25:02.403626605Z" level=info msg="container event discarded" container=88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986 type=CONTAINER_CREATED_EVENT Jan 15 01:25:02.454178 containerd[1715]: time="2026-01-15T01:25:02.454122063Z" level=info msg="container event discarded" container=88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986 type=CONTAINER_STARTED_EVENT Jan 15 01:25:02.691385 kubelet[3275]: E0115 01:25:02.691349 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:25:02.692513 kubelet[3275]: E0115 01:25:02.691349 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:25:06.693113 kubelet[3275]: E0115 01:25:06.693062 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:25:07.692297 kubelet[3275]: E0115 01:25:07.692243 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:25:12.156960 containerd[1715]: time="2026-01-15T01:25:12.156859249Z" level=info msg="container event discarded" container=58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0 type=CONTAINER_CREATED_EVENT Jan 15 01:25:12.156960 containerd[1715]: time="2026-01-15T01:25:12.156904447Z" 
level=info msg="container event discarded" container=58cedc76f5871646a6616b256dcb25cab9110d9829828b3a9a84dc5712f9f8b0 type=CONTAINER_STARTED_EVENT Jan 15 01:25:12.311285 containerd[1715]: time="2026-01-15T01:25:12.311207484Z" level=info msg="container event discarded" container=3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097 type=CONTAINER_CREATED_EVENT Jan 15 01:25:12.311285 containerd[1715]: time="2026-01-15T01:25:12.311277292Z" level=info msg="container event discarded" container=3858336ede90dd0cd6290b3a3da203dbc97fdf8aa39eef530c11b26e312fc097 type=CONTAINER_STARTED_EVENT Jan 15 01:25:12.692304 kubelet[3275]: E0115 01:25:12.692135 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:25:13.692726 kubelet[3275]: E0115 01:25:13.692694 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:25:13.694135 kubelet[3275]: E0115 01:25:13.694110 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:25:14.818534 containerd[1715]: time="2026-01-15T01:25:14.818457935Z" level=info msg="container event discarded" container=8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d type=CONTAINER_CREATED_EVENT Jan 15 01:25:14.893277 containerd[1715]: time="2026-01-15T01:25:14.893108753Z" level=info msg="container event discarded" container=8652b53c4734ec25ca2b38a2b31ab1831784a212311c6cad2ed5542d7cf90f9d type=CONTAINER_STARTED_EVENT Jan 15 01:25:16.411243 containerd[1715]: time="2026-01-15T01:25:16.411167186Z" level=info msg="container event discarded" container=0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff type=CONTAINER_CREATED_EVENT Jan 15 01:25:16.527695 containerd[1715]: time="2026-01-15T01:25:16.527626993Z" level=info msg="container event discarded" 
container=0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff type=CONTAINER_STARTED_EVENT Jan 15 01:25:17.692701 kubelet[3275]: E0115 01:25:17.691898 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:25:17.948058 containerd[1715]: time="2026-01-15T01:25:17.947888168Z" level=info msg="container event discarded" container=0ca209400a84820cfecd9a7f1d1d5d7ff41066ff0bef8b67964df5bd638bfaff type=CONTAINER_STOPPED_EVENT Jan 15 01:25:20.693199 kubelet[3275]: E0115 01:25:20.693097 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:25:21.693288 kubelet[3275]: E0115 01:25:21.693081 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:25:22.719311 containerd[1715]: time="2026-01-15T01:25:22.719235745Z" level=info msg="container event discarded" container=cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906 type=CONTAINER_CREATED_EVENT Jan 15 01:25:22.835054 containerd[1715]: time="2026-01-15T01:25:22.834914348Z" level=info msg="container event discarded" container=cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906 type=CONTAINER_STARTED_EVENT Jan 15 01:25:24.693737 kubelet[3275]: E0115 01:25:24.693681 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:25:25.671395 containerd[1715]: time="2026-01-15T01:25:25.671328960Z" level=info msg="container event discarded" container=cdd606cbccbb62f7ef703328adfca06ad94e27c4a7043cfe4959431668392906 type=CONTAINER_STOPPED_EVENT Jan 15 01:25:25.693686 kubelet[3275]: E0115 01:25:25.692991 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:25:27.692707 kubelet[3275]: E0115 01:25:27.692539 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:25:32.693338 kubelet[3275]: E0115 01:25:32.693168 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:25:32.693338 kubelet[3275]: E0115 01:25:32.693295 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:25:32.694869 kubelet[3275]: E0115 01:25:32.694758 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:25:33.890548 containerd[1715]: time="2026-01-15T01:25:33.890470773Z" level=info msg="container event discarded" container=251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6 type=CONTAINER_CREATED_EVENT Jan 15 01:25:34.048058 containerd[1715]: time="2026-01-15T01:25:34.047777108Z" level=info msg="container event discarded" container=251a633342c9856debcc4b17d24c1ee7865a28e9062bbbd92b7776e2887078c6 type=CONTAINER_STARTED_EVENT Jan 15 01:25:35.659132 containerd[1715]: time="2026-01-15T01:25:35.659081849Z" level=info msg="container event discarded" container=c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21 type=CONTAINER_CREATED_EVENT Jan 15 01:25:35.659132 containerd[1715]: time="2026-01-15T01:25:35.659122514Z" level=info msg="container event discarded" container=c32dcf090c217802f420d7639824877060de5eba12d8fa6f4b5558aa66f84c21 type=CONTAINER_STARTED_EVENT Jan 15 01:25:37.030286 containerd[1715]: time="2026-01-15T01:25:37.030227062Z" level=info msg="container event discarded" container=a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e type=CONTAINER_CREATED_EVENT Jan 15 01:25:37.030286 containerd[1715]: time="2026-01-15T01:25:37.030279481Z" level=info msg="container event discarded" container=a92b70de798c42287cf1b029e9db65123cf942cb0e3a9a9b62fd427fdad24a4e type=CONTAINER_STARTED_EVENT Jan 15 01:25:38.545724 containerd[1715]: time="2026-01-15T01:25:38.545667055Z" level=info msg="container event discarded" container=9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe type=CONTAINER_CREATED_EVENT Jan 15 01:25:38.545724 containerd[1715]: time="2026-01-15T01:25:38.545714604Z" level=info msg="container event discarded" container=9df20cc7a5dd788b927021a7f08cdd8ce41085252a5f1b8422be7a87319f2efe type=CONTAINER_STARTED_EVENT Jan 15 01:25:38.692458 kubelet[3275]: E0115 01:25:38.692408 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:25:38.693810 kubelet[3275]: E0115 01:25:38.692907 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" 
podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:25:38.707129 containerd[1715]: time="2026-01-15T01:25:38.707073719Z" level=info msg="container event discarded" container=d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6 type=CONTAINER_CREATED_EVENT Jan 15 01:25:38.707376 containerd[1715]: time="2026-01-15T01:25:38.707202480Z" level=info msg="container event discarded" container=d3cf7ecce8fb53e12f148d1e8771c29e29afbb26ab18a6f89d31c7e7c9337fb6 type=CONTAINER_STARTED_EVENT Jan 15 01:25:38.779657 containerd[1715]: time="2026-01-15T01:25:38.779597000Z" level=info msg="container event discarded" container=eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372 type=CONTAINER_CREATED_EVENT Jan 15 01:25:38.779861 containerd[1715]: time="2026-01-15T01:25:38.779739036Z" level=info msg="container event discarded" container=eb16d5bd27eaa9e7305e77b59d66d888f22181a25dcdb4a161e00c1152606372 type=CONTAINER_STARTED_EVENT Jan 15 01:25:38.975690 containerd[1715]: time="2026-01-15T01:25:38.975632306Z" level=info msg="container event discarded" container=2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce type=CONTAINER_CREATED_EVENT Jan 15 01:25:38.975690 containerd[1715]: time="2026-01-15T01:25:38.975679916Z" level=info msg="container event discarded" container=2858095dbff5853633bdc43ffd0cff287ff8805206b6b286f3892783102a6bce type=CONTAINER_STARTED_EVENT Jan 15 01:25:38.998256 containerd[1715]: time="2026-01-15T01:25:38.998170093Z" level=info msg="container event discarded" container=a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2 type=CONTAINER_CREATED_EVENT Jan 15 01:25:39.079531 containerd[1715]: time="2026-01-15T01:25:39.079480413Z" level=info msg="container event discarded" container=a06a9520da530af5014ffe81baa3ee11451d8f8b8bbaec4a74478e6b7c33d1c2 type=CONTAINER_STARTED_EVENT Jan 15 01:25:39.693710 kubelet[3275]: E0115 01:25:39.693216 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:25:40.060998 containerd[1715]: time="2026-01-15T01:25:40.060628433Z" level=info msg="container event discarded" container=c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5 type=CONTAINER_CREATED_EVENT Jan 15 01:25:40.060998 containerd[1715]: time="2026-01-15T01:25:40.060712523Z" level=info msg="container event discarded" container=c2d27de49cc1e5ff3a76a5306c6cb922bca01416daa097e131a5dfbddcc353f5 type=CONTAINER_STARTED_EVENT Jan 15 01:25:40.168046 containerd[1715]: time="2026-01-15T01:25:40.167968323Z" level=info msg="container event discarded" container=79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e type=CONTAINER_CREATED_EVENT Jan 15 01:25:40.168046 containerd[1715]: time="2026-01-15T01:25:40.168042862Z" level=info msg="container event discarded" container=79cf64710559020bc7d8be1ba23840e1ef9a45139840807fb51d7e5b1e49dc8e type=CONTAINER_STARTED_EVENT Jan 15 01:25:40.188280 containerd[1715]: time="2026-01-15T01:25:40.188229183Z" level=info msg="container event discarded" 
container=eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41 type=CONTAINER_CREATED_EVENT Jan 15 01:25:40.251500 containerd[1715]: time="2026-01-15T01:25:40.251436269Z" level=info msg="container event discarded" container=eea86877f924b4a6d7896f35cfde4761534d06b8ef281350c8bbb5f7183a4b41 type=CONTAINER_STARTED_EVENT Jan 15 01:25:46.692268 kubelet[3275]: E0115 01:25:46.692047 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:25:46.693388 kubelet[3275]: E0115 01:25:46.692942 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:25:46.693699 kubelet[3275]: E0115 01:25:46.693595 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:25:49.696204 kubelet[3275]: E0115 01:25:49.695831 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:25:51.692381 kubelet[3275]: E0115 01:25:51.691388 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:25:53.692442 kubelet[3275]: E0115 01:25:53.692378 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:26:00.219630 systemd[1]: cri-containerd-9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f.scope: Deactivated successfully. Jan 15 01:26:00.220159 systemd[1]: cri-containerd-9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f.scope: Consumed 4.037s CPU time, 55.8M memory peak, 248K read from disk. Jan 15 01:26:00.219000 audit: BPF prog-id=256 op=LOAD Jan 15 01:26:00.222070 kernel: audit: type=1334 audit(1768440360.219:744): prog-id=256 op=LOAD Jan 15 01:26:00.221000 audit: BPF prog-id=88 op=UNLOAD Jan 15 01:26:00.223368 containerd[1715]: time="2026-01-15T01:26:00.223341751Z" level=info msg="received container exit event container_id:\"9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f\" id:\"9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f\" pid:3125 exit_status:1 exited_at:{seconds:1768440360 nanos:223047888}" Jan 15 01:26:00.224064 kernel: audit: type=1334 audit(1768440360.221:745): prog-id=88 op=UNLOAD Jan 15 01:26:00.224000 audit: BPF prog-id=103 op=UNLOAD Jan 15 01:26:00.224000 audit: BPF prog-id=107 op=UNLOAD Jan 15 01:26:00.227370 kernel: audit: type=1334 audit(1768440360.224:746): prog-id=103 op=UNLOAD Jan 15 01:26:00.227833 kernel: audit: type=1334 audit(1768440360.224:747): prog-id=107 op=UNLOAD Jan 15 01:26:00.248981 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f-rootfs.mount: Deactivated successfully. 
Jan 15 01:26:00.454806 kubelet[3275]: E0115 01:26:00.454432 3275 controller.go:195] "Failed to update lease" err="Put \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 15 01:26:00.533690 kubelet[3275]: I0115 01:26:00.532483 3275 scope.go:117] "RemoveContainer" containerID="9093d92bc22a002b0a04f61e69336bc6100d020a7f24d5d044037bb29d99be8f" Jan 15 01:26:00.537043 containerd[1715]: time="2026-01-15T01:26:00.536590933Z" level=info msg="CreateContainer within sandbox \"279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 15 01:26:00.546442 containerd[1715]: time="2026-01-15T01:26:00.546397704Z" level=info msg="Container 2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:26:00.561283 containerd[1715]: time="2026-01-15T01:26:00.561251208Z" level=info msg="CreateContainer within sandbox \"279a845c3e86c0ac7439cf5f39d2ce421857c446009e53127181cc6fc9face3d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8\"" Jan 15 01:26:00.561839 containerd[1715]: time="2026-01-15T01:26:00.561817773Z" level=info msg="StartContainer for \"2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8\"" Jan 15 01:26:00.562812 containerd[1715]: time="2026-01-15T01:26:00.562791059Z" level=info msg="connecting to shim 2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8" address="unix:///run/containerd/s/46514ef6c2b9f049ad34480bfe0774b14eccfd40b36a55a935f7db475f6febe3" protocol=ttrpc version=3 Jan 15 01:26:00.590243 systemd[1]: Started cri-containerd-2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8.scope - libcontainer container 2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8. 
Jan 15 01:26:00.602000 audit: BPF prog-id=257 op=LOAD Jan 15 01:26:00.602000 audit: BPF prog-id=258 op=LOAD Jan 15 01:26:00.605416 kernel: audit: type=1334 audit(1768440360.602:748): prog-id=257 op=LOAD Jan 15 01:26:00.605460 kernel: audit: type=1334 audit(1768440360.602:749): prog-id=258 op=LOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.606567 kernel: audit: type=1300 audit(1768440360.602:749): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=258 op=UNLOAD Jan 15 01:26:00.616190 kernel: audit: type=1327 audit(1768440360.602:749): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.616241 kernel: audit: type=1334 audit(1768440360.602:750): prog-id=258 op=UNLOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.628156 kernel: audit: type=1300 audit(1768440360.602:750): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=259 op=LOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=260 op=LOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=321 
success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=260 op=UNLOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=259 op=UNLOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.602000 audit: BPF prog-id=261 op=LOAD Jan 15 01:26:00.602000 audit[5725]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2986 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:00.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264343731376533333830363965313034333461366130633833653032 Jan 15 01:26:00.637716 kubelet[3275]: E0115 01:26:00.637571 3275 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.78:54928->10.0.7.18:2379: read: connection timed out" Jan 15 01:26:00.660917 containerd[1715]: time="2026-01-15T01:26:00.660868368Z" level=info msg="StartContainer for \"2d4717e338069e10434a6a0c83e027927aa3b4cec66d089943dee9bf1b3d4ed8\" returns successfully" Jan 15 01:26:00.692676 kubelet[3275]: E0115 01:26:00.692646 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-9rs84" podUID="6ebcb69d-4b71-4631-9554-a5f179cc05ba" Jan 15 01:26:00.693098 kubelet[3275]: E0115 01:26:00.692720 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-qxn98" podUID="19b9076e-57b5-41dc-b303-63eafd79e78c" Jan 15 01:26:00.693098 kubelet[3275]: E0115 01:26:00.692900 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-srjh9" podUID="ddb26c79-6272-4ee5-ba41-ad8ec552e6c6" Jan 15 01:26:00.693098 kubelet[3275]: E0115 01:26:00.692831 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f47b45b95-r7r79" podUID="20583031-73a3-4ec4-aced-e96f0a2ba67b" Jan 15 01:26:01.387346 systemd[1]: cri-containerd-88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986.scope: Deactivated successfully. Jan 15 01:26:01.387619 systemd[1]: cri-containerd-88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986.scope: Consumed 42.432s CPU time, 112M memory peak. Jan 15 01:26:01.390834 containerd[1715]: time="2026-01-15T01:26:01.389423571Z" level=info msg="received container exit event container_id:\"88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986\" id:\"88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986\" pid:3591 exit_status:1 exited_at:{seconds:1768440361 nanos:388993230}" Jan 15 01:26:01.391000 audit: BPF prog-id=146 op=UNLOAD Jan 15 01:26:01.391000 audit: BPF prog-id=150 op=UNLOAD Jan 15 01:26:01.419659 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986-rootfs.mount: Deactivated successfully. 
Jan 15 01:26:01.539630 kubelet[3275]: I0115 01:26:01.539586 3275 scope.go:117] "RemoveContainer" containerID="88cdc27263dc5abba19080600f3cc90db43121bff418f932f79038ebbaade986" Jan 15 01:26:01.551492 containerd[1715]: time="2026-01-15T01:26:01.551443790Z" level=info msg="CreateContainer within sandbox \"56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 15 01:26:01.564642 containerd[1715]: time="2026-01-15T01:26:01.564611550Z" level=info msg="Container bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:26:01.575935 containerd[1715]: time="2026-01-15T01:26:01.575903124Z" level=info msg="CreateContainer within sandbox \"56493a1652b1ff3b538e976ba46d0c728db0e46019dc8a7e221703e74df8743d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951\"" Jan 15 01:26:01.576491 containerd[1715]: time="2026-01-15T01:26:01.576472293Z" level=info msg="StartContainer for \"bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951\"" Jan 15 01:26:01.577255 containerd[1715]: time="2026-01-15T01:26:01.577223853Z" level=info msg="connecting to shim bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951" address="unix:///run/containerd/s/44959c5fd90f1d056ac218737007edb472d070383e0c655fe37eb5ab7401d7c9" protocol=ttrpc version=3 Jan 15 01:26:01.605202 systemd[1]: Started cri-containerd-bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951.scope - libcontainer container bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951. Jan 15 01:26:01.625000 audit: BPF prog-id=262 op=LOAD Jan 15 01:26:01.625000 audit: BPF prog-id=263 op=LOAD Jan 15 01:26:01.625000 audit[5767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=263 op=UNLOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=264 op=LOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=265 op=LOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=265 op=UNLOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=264 op=UNLOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.627000 audit: BPF prog-id=266 op=LOAD Jan 15 01:26:01.627000 audit[5767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3405 pid=5767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:01.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265663264326238373537366664633031383730663264343738636432 Jan 15 01:26:01.650309 containerd[1715]: time="2026-01-15T01:26:01.650203193Z" level=info msg="StartContainer for \"bef2d2b87576fdc01870f2d478cd2ffd7661e9eefc5c4779bdf0e7117fbd7951\" returns successfully" Jan 15 01:26:02.173998 kubelet[3275]: E0115 01:26:02.173200 3275 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.78:54726->10.0.7.18:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-kube-controllers-9c45d7c9c-l7rhq.188ac2d8be73d929 calico-system 1877 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:calico-kube-controllers-9c45d7c9c-l7rhq,UID:0d14115d-26fb-4eac-a6b9-b5aa96406bb8,APIVersion:v1,ResourceVersion:782,FieldPath:spec.containers{calico-kube-controllers},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-d76f075714,},FirstTimestamp:2026-01-15 01:20:39 +0000 UTC,LastTimestamp:2026-01-15 01:25:51.691348812 +0000 UTC m=+358.114789435,Count:20,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-d76f075714,}" Jan 15 01:26:02.692469 kubelet[3275]: E0115 01:26:02.692422 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-9c45d7c9c-l7rhq" podUID="0d14115d-26fb-4eac-a6b9-b5aa96406bb8" Jan 15 01:26:04.691390 kubelet[3275]: E0115 01:26:04.691334 3275 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5b8cdf5dcc-grnmf" podUID="fc1c990f-1003-460d-a72d-34a2a5fb4d83" Jan 15 01:26:06.156033 systemd[1]: cri-containerd-e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116.scope: Deactivated successfully. Jan 15 01:26:06.157000 audit: BPF prog-id=267 op=LOAD Jan 15 01:26:06.157177 systemd[1]: cri-containerd-e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116.scope: Consumed 3.579s CPU time, 24.8M memory peak, 324K read from disk. Jan 15 01:26:06.159196 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 15 01:26:06.159250 kernel: audit: type=1334 audit(1768440366.157:766): prog-id=267 op=LOAD Jan 15 01:26:06.157000 audit: BPF prog-id=93 op=UNLOAD Jan 15 01:26:06.161253 containerd[1715]: time="2026-01-15T01:26:06.161142493Z" level=info msg="received container exit event container_id:\"e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116\" id:\"e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116\" pid:3136 exit_status:1 exited_at:{seconds:1768440366 nanos:159289012}" Jan 15 01:26:06.161980 kernel: audit: type=1334 audit(1768440366.157:767): prog-id=93 op=UNLOAD Jan 15 01:26:06.162027 kernel: audit: type=1334 audit(1768440366.160:768): prog-id=108 op=UNLOAD Jan 15 01:26:06.160000 audit: BPF prog-id=108 op=UNLOAD Jan 15 01:26:06.160000 audit: BPF prog-id=112 op=UNLOAD Jan 15 01:26:06.165031 kernel: audit: type=1334 audit(1768440366.160:769): prog-id=112 op=UNLOAD Jan 15 01:26:06.182944 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116-rootfs.mount: Deactivated successfully. 
Jan 15 01:26:06.553416 kubelet[3275]: I0115 01:26:06.553390 3275 scope.go:117] "RemoveContainer" containerID="e1a9a69f281c4939cb444ed1bfc85b218a064c52a9c47a646377d7c719b09116" Jan 15 01:26:06.556118 containerd[1715]: time="2026-01-15T01:26:06.556053874Z" level=info msg="CreateContainer within sandbox \"791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 15 01:26:06.567563 containerd[1715]: time="2026-01-15T01:26:06.567526100Z" level=info msg="Container e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e: CDI devices from CRI Config.CDIDevices: []" Jan 15 01:26:06.581190 containerd[1715]: time="2026-01-15T01:26:06.581149135Z" level=info msg="CreateContainer within sandbox \"791331ecc3fbcfd3a419281927a10fa61fb8ebab84bb774fe5593cd2d9845502\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e\"" Jan 15 01:26:06.582039 containerd[1715]: time="2026-01-15T01:26:06.581567028Z" level=info msg="StartContainer for \"e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e\"" Jan 15 01:26:06.582516 containerd[1715]: time="2026-01-15T01:26:06.582492531Z" level=info msg="connecting to shim e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e" address="unix:///run/containerd/s/0b0c092080c89c2f3d2d000bf1ff3ae60a137856226a0957d93f90b870a6935b" protocol=ttrpc version=3 Jan 15 01:26:06.608383 systemd[1]: Started cri-containerd-e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e.scope - libcontainer container e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e. Jan 15 01:26:06.619000 audit: BPF prog-id=268 op=LOAD Jan 15 01:26:06.622032 kernel: audit: type=1334 audit(1768440366.619:770): prog-id=268 op=LOAD Jan 15 01:26:06.620000 audit: BPF prog-id=269 op=LOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.624486 kernel: audit: type=1334 audit(1768440366.620:771): prog-id=269 op=LOAD Jan 15 01:26:06.624542 kernel: audit: type=1300 audit(1768440366.620:771): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.628414 kernel: audit: type=1327 audit(1768440366.620:771): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=269 op=UNLOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.634022 kernel: audit: type=1334 audit(1768440366.620:772): prog-id=269 op=UNLOAD Jan 15 01:26:06.634059 kernel: audit: type=1300 audit(1768440366.620:772): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=270 op=LOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=271 op=LOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=271 op=UNLOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=270 op=UNLOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.620000 audit: BPF prog-id=272 op=LOAD Jan 15 01:26:06.620000 audit[5812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2974 pid=5812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:26:06.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393032376263646164396539353666613239633664363732353436 Jan 15 01:26:06.669007 containerd[1715]: time="2026-01-15T01:26:06.668976217Z" level=info msg="StartContainer for \"e99027bcdad9e956fa29c6d672546126fdda7d869a9cbc36d01bc8d0472af50e\" returns successfully" Jan 15 01:26:08.765545 kubelet[3275]: E0115 01:26:08.765007 3275 kubelet_node_status.go:548] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"NetworkUnavailable\\\"},{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-15T01:26:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-15T01:26:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-15T01:26:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-15T01:26:00Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"ci-4515-1-0-n-d76f075714\": rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.78:54830->10.0.7.18:2379: read: connection timed out" Jan 15 01:26:10.639201 kubelet[3275]: E0115 01:26:10.638518 3275 controller.go:195] "Failed to update lease" err="Put \"https://10.0.7.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-d76f075714?timeout=10s\": context deadline exceeded" Jan 15 01:26:11.005323 kubelet[3275]: I0115 01:26:11.004909 3275 status_manager.go:890] "Failed to get status for pod" podUID="c12e4e6a98d84f27efaba5c27ae424e3" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-d76f075714" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.78:54836->10.0.7.18:2379: read: connection timed out"
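The audit records interleaved through this section are dense but decodable: arch=c000003e is x86_64, syscall=321 is bpf(2) (matching the paired BPF prog-id LOAD/UNLOAD events) and syscall=3 is close(2), while each PROCTITLE value is the process's argv hex-encoded with NUL separators. A minimal sketch of turning a proctitle field back into a readable command line; the helper name is illustrative, not from the log:

    def decode_proctitle(hex_value: str) -> str:
        """Decode an audit PROCTITLE field: hex-encoded argv, NUL-separated."""
        raw = bytes.fromhex(hex_value)
        return " ".join(a.decode("utf-8", "replace") for a in raw.split(b"\x00") if a)

    # The leading bytes of the proctitle values above decode to the runc invocation:
    print(decode_proctitle(
        "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67"
    ))
    # -> runc --root /run/containerd/runc/k8s.io --log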